Model Catalog

Every model currently routable through Northern Inference is listed below. Copy the Routing key into your OpenAI-compatible client (Cursor, Aider, Continue, Zed, LibreChat, LangChain, …) and POST to /v1/chat/completions; the Routing key is the exact string you put in the model field. Anthropic-native clients work too: use the same base URL, POST to /v1/messages, and pass native model IDs (e.g. claude-sonnet-4-5-20250929).
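The request shape above can be sketched as follows. This is an illustration only: the base URL, API key, and routing key are placeholders, not real values from the catalog.

```python
import json

BASE_URL = "https://gateway.example.invalid"  # placeholder: your Northern Inference base URL
API_KEY = "sk-..."  # placeholder: your API key

def build_chat_request(routing_key: str, prompt: str) -> dict:
    """Build a POST request for /v1/chat/completions.

    routing_key is the string copied from the catalog's Routing column;
    it goes into the "model" field verbatim.
    """
    return {
        "url": f"{BASE_URL}/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": routing_key,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

# Hypothetical routing key, for illustration:
req = build_chat_request("gpt-4o-ca", "Hello")
```

Send the resulting body with any HTTP client; Anthropic-native clients follow the same pattern against /v1/messages with a native model ID instead.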

How to read jurisdiction:
- CA: Canadian data residency (Bedrock ca-central-1 / Azure Canada East).
- US: US-hosted (OpenAI, Anthropic direct, Azure GlobalStandard).
- Multi: Cross-region routing (Bedrock global.* inference profiles).
Region-suffixed model IDs (-ca, -us, -global) make the jurisdiction explicit in your model dropdown.
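The suffix convention can be applied mechanically. A minimal sketch, assuming the -ca / -us / -global suffixes described above are the only ones in use (the sample IDs are hypothetical):

```python
def jurisdiction_of(model_id: str) -> str:
    """Map a region-suffixed model ID to its jurisdiction label."""
    if model_id.endswith("-ca"):
        return "CA"      # Canadian data residency
    if model_id.endswith("-us"):
        return "US"      # US-hosted
    if model_id.endswith("-global"):
        return "Multi"   # cross-region routing
    return "unknown"     # no region suffix present

# Hypothetical IDs, for illustration:
jurisdiction_of("claude-sonnet-4-5-ca")   # "CA"
jurisdiction_of("gpt-4o-us")              # "US"
```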
The model table is populated dynamically from /api/billing/models.