Every model currently routable through Northern Inference is listed below. Copy the Routing key into any OpenAI-compatible client (Cursor, Aider, Continue, Zed, LibreChat, LangChain, …) and POST to /v1/chat/completions. Anthropic-native clients work too: use the same base URL, but POST to /v1/messages with the native model ID (e.g. claude-sonnet-4-5-20250929). The model ID shown is the exact string to put in the model field of your request.
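As a minimal sketch of the two request shapes described above (the base URL and key below are placeholders, not real values — substitute your router's base URL and your own Routing key):

```python
import json

# Placeholders -- hypothetical values, replace with your actual base URL and Routing key.
BASE_URL = "https://example.invalid"
ROUTING_KEY = "sk-..."

# OpenAI-compatible payload, sent as POST {BASE_URL}/v1/chat/completions
openai_payload = {
    "model": "claude-sonnet-4-5-20250929",  # the exact model ID string from the table
    "messages": [{"role": "user", "content": "Hello"}],
}

# Anthropic-native payload, sent as POST {BASE_URL}/v1/messages with the same base URL
anthropic_payload = {
    "model": "claude-sonnet-4-5-20250929",
    "max_tokens": 1024,  # required by the Anthropic Messages API
    "messages": [{"role": "user", "content": "Hello"}],
}

# Auth header shape for an OpenAI-compatible endpoint (Anthropic-native clients
# typically send the key as "x-api-key" instead).
headers = {
    "Authorization": f"Bearer {ROUTING_KEY}",
    "Content-Type": "application/json",
}

print(json.dumps(openai_payload, indent=2))
```

Both payloads use the same messages array; the main practical difference is that the Anthropic Messages API requires an explicit max_tokens value.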