Open WebUI

Use Northern Inference with Open WebUI

Open WebUI is a self-hostable ChatGPT-style frontend. It supports OpenAI-compatible backends natively.

Docker Compose snippet

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      OPENAI_API_BASE_URL: https://northerninference.ca/v1
      OPENAI_API_KEY: ni_live_YOUR_KEY_HERE
      # Optional: connect multiple endpoints (e.g. NI plus a local Ollama)
      # OPENAI_API_BASE_URLS: "https://northerninference.ca/v1;http://ollama:11434/v1"
      # OPENAI_API_KEYS: "ni_live_YOUR_KEY;ollama"
    volumes:
      - open-webui:/app/backend/data

# The named volume must also be declared at the top level,
# or Compose will reject the file.
volumes:
  open-webui:
```
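If you enable the optional multi-endpoint variables, the two semicolon-separated lists appear to pair by position (first key with first URL, and so on); verify this against your Open WebUI version's docs. A quick sketch of that pairing, using the example values from the snippet:

```python
# Split the semicolon-separated lists the way Open WebUI's
# OPENAI_API_BASE_URLS / OPENAI_API_KEYS variables are written.
base_urls = "https://northerninference.ca/v1;http://ollama:11434/v1".split(";")
api_keys = "ni_live_YOUR_KEY;ollama".split(";")

# Each backend is assumed to get the key at the same index.
connections = dict(zip(base_urls, api_keys))
print(connections)
# → {'https://northerninference.ca/v1': 'ni_live_YOUR_KEY',
#    'http://ollama:11434/v1': 'ollama'}
```

If the lists have unequal lengths, the extra entries are silently dropped by `zip` here; keep the two variables the same length in your compose file.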

Open http://localhost:3000, sign in, and go to Settings → Admin Settings → Connections. The NI connection should already be listed; click Refresh to pull the model list.
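Under the hood, that Refresh is a standard OpenAI-style model listing: a GET to `<base>/models` with a Bearer token. A minimal stdlib sketch of the same request, with a placeholder key:

```python
import urllib.request

# Placeholder key; substitute your real NI key before sending.
BASE_URL = "https://northerninference.ca/v1"
API_KEY = "ni_live_YOUR_KEY_HERE"

# Build the request Open WebUI's "Refresh" effectively issues.
req = urllib.request.Request(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)

# Actually sending it needs a valid key and network access:
# import json
# with urllib.request.urlopen(req) as resp:
#     print([m["id"] for m in json.load(resp)["data"]])

print(req.full_url)                    # endpoint the UI polls
print(req.get_header("Authorization"))  # how the key is presented
```

If the model list stays empty, checking this request by hand (e.g. with `curl`) separates a bad key or URL from a UI-side issue.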

Per-user NI keys (multi-tenant)

Open WebUI gives each user their own Settings → Connections tab. If you host Open WebUI internally and each user has their own NI key, have them paste it there instead of setting a global key in the compose file; NI then bills each user's team separately.

Privacy tier selection

Open WebUI has no UI for `extra_body`. Either use NI portal-issued keys with a baked-in default tier, or use Open WebUI's Model Params → Custom JSON field (if available in your version) to set:

```json
{"privacy_tier": "managed_canadian_cloud"}
```
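If you call NI directly rather than through Open WebUI, the same field rides along as an extra top-level key in the request body. A minimal sketch with the Python stdlib, assuming `privacy_tier` is accepted alongside the standard chat completion fields as in the Custom JSON above (model name is a placeholder):

```python
import json

# Chat completions payload carrying the NI privacy tier as an
# extra top-level field next to the standard OpenAI fields.
payload = {
    "model": "YOUR_MODEL_ID",  # placeholder
    "messages": [{"role": "user", "content": "Hello"}],
    "privacy_tier": "managed_canadian_cloud",
}

body = json.dumps(payload)
print(body)
```

With the official `openai` Python SDK (v1+), the equivalent is passing `extra_body={"privacy_tier": "managed_canadian_cloud"}` to `client.chat.completions.create(...)`, which merges the dict into the request body the same way.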

Source: tests/user_run_tests/integrations/openweb_ui.md. Spot a problem? Let us know.