AI Providers
Sadie routes every LLM call through a tier-aware resolver. Chat, wiki compilation, brief generation, and policy judging each have a default tier, and each tier resolves to a concrete provider and model through the same precedence chain.
The chain is designed so that a user with their own API key always wins, a user preference without a key falls back to the deployment’s env, and the local stub only ever runs in development.
Resolution order
When Sadie needs a model for a task, getProviderForTask(taskClass, userPrefs) walks this list and stops at the first hit:
- User-supplied API key. If the user has set their preferred provider in Settings and saved their own key for it, that key is used. One call per user, cached by key suffix.
- User’s preferred provider with the deployment env key. If the user chose Anthropic or OpenAI in Settings but did not bring a key, Sadie uses the deployment’s ANTHROPIC_API_KEY or OPENAI_API_KEY for that provider.
- Tier env vars. If the task’s tier has AI_FRONTIER_*_PROVIDER and AI_FRONTIER_*_MODEL set, that provider is instantiated with the matching env key. This lets a deployment route tier 1 calls (wiki_lint, source_extraction, preference_normalization) to a cheap model while tier 2 calls (brief_generation, grounded_chat) use a workhorse.
- Raw ANTHROPIC_API_KEY or OPENAI_API_KEY. If no tier vars are set but a raw key exists, Sadie uses the Anthropic key first (canonical), then OpenAI, with a sensible default model per tier.
- Local stub (dev only). If no real credentials are configured and NODE_ENV !== "production", the local stub serves deterministic Sadie-voice templates. Chat explicitly refuses the stub and returns a MissingLlmProviderError so self-hosters cannot ship without real credentials.
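The precedence chain above can be sketched in TypeScript. This is a minimal illustrative model, not Sadie's actual implementation: the type names, the resolveProvider signature, and the source labels are all assumptions made for the sketch.

```typescript
// Illustrative model of the precedence chain. Types and names here are
// hypothetical; the real resolver is getProviderForTask in Sadie's code.
type Provider = "anthropic" | "openai";

interface UserPrefs {
  preferredProvider?: Provider;
  apiKey?: string; // user-supplied key for the preferred provider
}

interface ProviderChoice {
  provider: Provider | "local-stub";
  apiKey?: string;
  source: "user-key" | "user-pref-env" | "tier-env" | "raw-env" | "stub";
}

const ENV_KEY: Record<Provider, string> = {
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
};

function resolveProvider(
  tierEnvPrefix: string, // e.g. "AI_FRONTIER_WORKHORSE"
  prefs: UserPrefs,
  env: Record<string, string | undefined>,
  nodeEnv: string,
): ProviderChoice {
  // 1. A user-supplied API key always wins.
  if (prefs.preferredProvider && prefs.apiKey) {
    return { provider: prefs.preferredProvider, apiKey: prefs.apiKey, source: "user-key" };
  }
  // 2. User preference backed by the deployment's env key.
  if (prefs.preferredProvider && env[ENV_KEY[prefs.preferredProvider]]) {
    return {
      provider: prefs.preferredProvider,
      apiKey: env[ENV_KEY[prefs.preferredProvider]],
      source: "user-pref-env",
    };
  }
  // 3. Tier env vars: both provider and model must be set.
  const tierProvider = env[`${tierEnvPrefix}_PROVIDER`] as Provider | undefined;
  if (tierProvider && env[`${tierEnvPrefix}_MODEL`] && env[ENV_KEY[tierProvider]]) {
    return { provider: tierProvider, apiKey: env[ENV_KEY[tierProvider]], source: "tier-env" };
  }
  // 4. Raw vendor keys: Anthropic first (canonical), then OpenAI.
  if (env.ANTHROPIC_API_KEY) {
    return { provider: "anthropic", apiKey: env.ANTHROPIC_API_KEY, source: "raw-env" };
  }
  if (env.OPENAI_API_KEY) {
    return { provider: "openai", apiKey: env.OPENAI_API_KEY, source: "raw-env" };
  }
  // 5. Local stub, development only.
  if (nodeEnv !== "production") {
    return { provider: "local-stub", source: "stub" };
  }
  throw new Error("MissingLlmProviderError: no LLM credentials configured");
}
```

Note that step 3 falls through unless provider, model, and the matching vendor key are all present, which matches the "set both provider and model per tier" requirement below.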
Task tiers
Each task class maps to a tier. The defaults live in packages/ai/src/model-router.ts:
- Tier 1 (cheap frontier, high-volume maintenance): source_extraction, candidate_page_matching, wiki_patch_draft, contradiction_detection, today_card_features, preference_normalization, wiki_lint
- Tier 2 (synthesis, normal interactive): wiki_page_create, today_card_copy, brief_generation, grounded_chat, studio_rewrite
- Tier 3 (deliberate reasoning): contradiction_resolution, deep_synthesis_chat
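The default mapping above can be written out as a simple lookup table. The table contents mirror the tiers listed here; the tierFor helper and its error message are illustrative, not Sadie's actual API.

```typescript
// Illustrative tier table mirroring the defaults described above.
// The real table lives in packages/ai/src/model-router.ts.
type Tier = 1 | 2 | 3;

const TASK_TIERS: Record<string, Tier> = {
  // Tier 1: cheap frontier, high-volume maintenance
  source_extraction: 1,
  candidate_page_matching: 1,
  wiki_patch_draft: 1,
  contradiction_detection: 1,
  today_card_features: 1,
  preference_normalization: 1,
  wiki_lint: 1,
  // Tier 2: synthesis, normal interactive
  wiki_page_create: 2,
  today_card_copy: 2,
  brief_generation: 2,
  grounded_chat: 2,
  studio_rewrite: 2,
  // Tier 3: deliberate reasoning
  contradiction_resolution: 3,
  deep_synthesis_chat: 3,
};

// Hypothetical accessor: fail loudly on unknown task classes rather
// than silently defaulting to a tier.
function tierFor(taskClass: string): Tier {
  const tier = TASK_TIERS[taskClass];
  if (tier === undefined) throw new Error(`Unknown task class: ${taskClass}`);
  return tier;
}
```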
Self-hoster env vars
For a self-hosted deployment, the minimum configuration is one raw vendor key:
ANTHROPIC_API_KEY=sk-ant-...
# or
OPENAI_API_KEY=sk-...

For tier routing, set both provider and model per tier:

AI_FRONTIER_SMALL_PROVIDER=anthropic
AI_FRONTIER_SMALL_MODEL=claude-haiku-4-5-20251001
AI_FRONTIER_WORKHORSE_PROVIDER=anthropic
AI_FRONTIER_WORKHORSE_MODEL=claude-sonnet-4-6
AI_FRONTIER_REASONING_PROVIDER=anthropic
AI_FRONTIER_REASONING_MODEL=claude-opus-4-7

To forbid the local stub even in development, set SADIE_ALLOW_LOCAL_AI_STUB=0. This makes calls fail loudly instead of silently returning template text.
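The tier env vars follow a SMALL/WORKHORSE/REASONING naming pattern that lines up with tiers 1 through 3. A small sketch of how a deployment's env might be read per tier, assuming that mapping (the TIER_PREFIX table and tierRouting function are illustrative names, not Sadie's API):

```typescript
// Hypothetical helper mapping tier numbers to the AI_FRONTIER_* prefixes
// shown above. The prefix-to-tier correspondence is an assumption drawn
// from the tier descriptions (cheap / workhorse / reasoning).
const TIER_PREFIX: Record<1 | 2 | 3, string> = {
  1: "AI_FRONTIER_SMALL",
  2: "AI_FRONTIER_WORKHORSE",
  3: "AI_FRONTIER_REASONING",
};

function tierRouting(
  tier: 1 | 2 | 3,
  env: Record<string, string | undefined>,
): { provider: string; model: string } | null {
  const prefix = TIER_PREFIX[tier];
  const provider = env[`${prefix}_PROVIDER`];
  const model = env[`${prefix}_MODEL`];
  // Both vars must be set; otherwise resolution falls through
  // to the raw vendor keys.
  if (!provider || !model) return null;
  return { provider, model };
}
```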