Aider is a CLI pair-programmer. It uses LiteLLM internally, so any OpenAI-compatible endpoint works.
Setup
pip install aider-chat
export OPENAI_API_KEY=ni_live_YOUR_KEY_HERE
export OPENAI_API_BASE=https://northerninference.ca/v1
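These two variables are all the routing configuration aider needs. As an illustration only (this is not aider's source, and `request_target` is a hypothetical helper), here is what an OpenAI-compatible client conceptually does with them:

```python
import os

# Placeholder values, matching the exports above.
os.environ["OPENAI_API_KEY"] = "ni_live_YOUR_KEY_HERE"
os.environ["OPENAI_API_BASE"] = "https://northerninference.ca/v1"

def request_target(path: str = "/chat/completions") -> tuple[str, dict]:
    """Build the URL and auth header an OpenAI-compatible client would use."""
    base = os.environ["OPENAI_API_BASE"].rstrip("/")
    headers = {"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"}
    return base + path, headers

url, headers = request_target()
print(url)  # https://northerninference.ca/v1/chat/completions
```

If the base URL is wrong, requests fail before any model is involved, so this is the first thing to check when troubleshooting.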
Run
# Use Claude Sonnet 4.5 via NI (Canadian residency by default)
aider --model openai/anthropic/claude-sonnet-4.5
# Use the provider tier for max speed
aider --model openai/anthropic/claude-sonnet-4.5 \
--extra-body '{"privacy_tier": "provider_api"}'
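The value passed to `--extra-body` is a JSON object merged into the top level of the request body. A minimal sketch of that merge (the payload shape is the standard chat-completions format; this is not aider's actual implementation):

```python
import json

# The JSON object from --extra-body, parsed as a dict.
extra_body = json.loads('{"privacy_tier": "provider_api"}')

# A representative chat-completions payload.
payload = {
    "model": "anthropic/claude-sonnet-4.5",
    "messages": [{"role": "user", "content": "ping"}],
}

# Extra fields ride along at the top level of the same request,
# which is how NI receives privacy_tier.
payload.update(extra_body)

print(json.dumps(payload, indent=2))
```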
# With prompt caching enabled (reduces cost on repeated context)
aider --model openai/anthropic/claude-sonnet-4.5 \
--cache-prompts
Note on the openai/ prefix
Aider's model convention is <provider>/<model>. Because NI is configured as an OpenAI-compatible endpoint (provider=openai), the model must be prefixed with openai/. Aider strips that prefix and sends anthropic/claude-sonnet-4.5 as the model string to NI — exactly what NI expects.
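The splitting described above can be sketched in a couple of lines (an illustration of the convention, not aider's code):

```python
# The first path segment selects the provider/protocol; everything after
# it is passed through untouched as the model string.
model_arg = "openai/anthropic/claude-sonnet-4.5"
provider, model = model_arg.split("/", 1)

print(provider)  # openai  -> route via the OpenAI-compatible API
print(model)     # anthropic/claude-sonnet-4.5  -> model string NI receives
```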
Verify
aider --model openai/anthropic/claude-sonnet-4.5 --message "ping"
Then confirm the request appears under Usage in the NI portal.