feat: tiered LLM providers (fast/heavy) #43
Summary
Adds tier-based LLM routing so fast local models (vLLM, Ollama) handle structured tasks while cloud models (Gemini, Anthropic) are reserved for heavy reasoning. Single-provider configs continue to work via a fallback chain.

- `Tier` type + `ProviderRegistry` with `assignTier`/`getProvider`/fallback chain
- Multi-provider config format: `{ providers: [{ name, type, tier, ... }] }`
- `NamedProvider` wrapper for multiple instances of the same provider type
- Setup wizard: Simple (legacy) / Advanced (fast+heavy tiers) modes
- Status display: tiered view with `/llm/providers` endpoint
- Call sites use `getProvider('fast')` instead of `getActive()`
- Full backward compatibility with existing single-provider configs

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
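A minimal sketch of the tier routing described above. Only the names (`Tier`, `ProviderRegistry`, `assignTier`, `getProvider`, the fallback chain) come from this PR; the `Provider` interface, the internal maps, and the fallback order are illustrative assumptions, not the actual implementation.

```typescript
// Sketch only: Provider shape and fallback order are assumptions.
type Tier = 'fast' | 'heavy';

interface Provider {
  name: string;
  complete(prompt: string): Promise<string>;
}

class ProviderRegistry {
  private byTier = new Map<Tier, Provider>();
  // Order in which other tiers are tried when the requested one is empty.
  private fallbackOrder: Tier[] = ['fast', 'heavy'];

  assignTier(tier: Tier, provider: Provider): void {
    this.byTier.set(tier, provider);
  }

  // Return the provider for the requested tier, falling back to any other
  // configured tier so legacy single-provider configs keep working.
  getProvider(tier: Tier): Provider {
    const direct = this.byTier.get(tier);
    if (direct) return direct;
    for (const t of this.fallbackOrder) {
      const p = this.byTier.get(t);
      if (p) return p;
    }
    throw new Error('no LLM providers configured');
  }
}
```

With this shape, a config that only defines one provider still resolves `getProvider('fast')` and `getProvider('heavy')` to that single instance, which is how the backward-compatibility fallback behaves from the caller's point of view.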