knowledgeCorpus: Directories containing .md / .mdx files to ingest as the knowledge
corpus. init() will throw if these paths resolve to zero readable markdown
sections, because a successful router init should imply a non-empty corpus.
Optional confidence…: Minimum confidence threshold for accepting a classification result. If confidence falls below this, the router may escalate to a higher tier. Default: 0.7
Optional classifier…: LLM model for the classifier. Default: 'gpt-4o-mini'
Optional classifier…: LLM provider for the classifier. Default: 'openai'
Optional max…: Maximum tier the classifier may assign. Default: 3
Optional embedding…: Embedding provider name. Default: 'openai'
Optional embedding…: Embedding model identifier. Default: 'text-embedding-3-small'
Optional generation…: LLM model for T0/T1 generation. Default: 'gpt-4o-mini'
Optional generation…: LLM model for T2/T3 (deep) generation. Default: 'gpt-4o'
Optional generation…: LLM provider for generation. Default: 'openai'
Optional graph…: Whether to enable GraphRAG-based retrieval for tier >= 2 queries. Requires a configured GraphRAG engine. Default: true
Optional deep…: Whether to enable deep research mode for tier 3 queries. Research mode performs iterative multi-pass retrieval and synthesis. Default: true
Optional conversation…: Number of recent conversation messages to include as context for classification and generation. Default: 5
Optional max…: Maximum estimated tokens to allocate for documentation context. Default: 4000
Optional cache…: Whether to cache query results. Default: true
Optional available…: Tool/capability names exposed to the classifier prompt so it can reason about what the runtime can actually do. Default: []
Optional graphExpand: Host-provided graph expansion callback.
Provide this to replace the built-in placeholder graphExpand() branch
with a real GraphRAG or relationship-expansion implementation.
Optional rerank…: Host-provided reranker callback.
Provide this to replace the built-in lexical heuristic reranker with a provider-backed or cross-encoder reranker.
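The built-in lexical heuristic is not specified in this reference; as a rough sketch, it could resemble a term-overlap scorer like the one below. The Chunk shape and the function signature here are assumptions for illustration, not the package's real types.

```typescript
// Hypothetical chunk shape; the real QueryRouter retrieval types are
// not shown in this reference.
interface Chunk {
  text: string;
  score: number;
}

// Minimal lexical reranker: scores each chunk by the fraction of query
// terms it contains, then sorts descending. A host-provided rerank
// callback would replace logic like this with a cross-encoder or a
// provider-backed reranking API.
function lexicalRerank(query: string, chunks: Chunk[]): Chunk[] {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  return chunks
    .map((c) => {
      const text = c.text.toLowerCase();
      const hits = terms.filter((t) => text.includes(t)).length;
      return { ...c, score: terms.length ? hits / terms.length : 0 };
    })
    .sort((a, b) => b.score - a.score);
}
```

A cross-encoder reranker would keep the same shape (chunks in, reordered chunks out) but score each query/chunk pair with a model call instead of term overlap.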
Optional verify…: Enable automatic citation verification on deep research responses. When true, moderate-depth queries also verify citations. Default: false (only deep research verifies automatically).
Optional deep…: Host-provided deep research callback.
Provide this to replace the built-in placeholder research branch with a
real multi-source research runtime. The sources argument receives
normalized research-source hints such as web, docs, or media,
not raw classifier retrieval labels.
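The callback's exact type is not given in this reference. A hedged sketch, assuming it receives the query plus the normalized source hints described above and returns synthesized text (both the type alias and the fan-out strategy are illustrative):

```typescript
// Hypothetical signature; the real deep-research callback type is not
// shown in this reference. `sources` carries normalized hints such as
// "web", "docs", or "media", not raw classifier retrieval labels.
type DeepResearchFn = (query: string, sources: string[]) => Promise<string>;

// Sketch of a host-provided implementation that fans out one retrieval
// pass per source hint and concatenates the findings.
const myDeepResearch: DeepResearchFn = async (query, sources) => {
  const passes = await Promise.all(
    sources.map(async (source) => {
      switch (source) {
        case "web":
          return `[web findings for: ${query}]`; // call a real search API here
        case "docs":
          return `[docs findings for: ${query}]`; // query the local corpus here
        case "media":
          return `[media findings for: ${query}]`; // analyze media here
        default:
          return ""; // ignore hints this host cannot serve
      }
    }),
  );
  return passes.filter(Boolean).join("\n");
};
```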
Optional on…: Hook called after classification completes. Receives the ClassificationResult for consumer integration.
Optional on…: Hook called after retrieval completes. Receives the RetrievalResult for consumer integration.
Optional apiKey: API key override for classifier and generator LLM calls.
When omitted, QueryRouter prefers OPENAI_API_KEY and falls back to
OPENROUTER_API_KEY with the OpenRouter compatibility base URL.
Optional baseUrl: Base URL override for classifier and generator LLM providers.
When omitted, QueryRouter auto-selects the OpenRouter compatibility URL
only when OPENROUTER_API_KEY is being used implicitly.
Optional embeddingApiKey: API key override for embeddings only.
When omitted, embeddings fall back to apiKey, then OPENAI_API_KEY,
then OPENROUTER_API_KEY.
This is useful when generation uses an OpenAI-compatible endpoint like
OpenRouter but embeddings should stay on a direct OpenAI key.
Optional embeddingBaseUrl: Base URL override for embeddings only.
When omitted, embeddings inherit baseUrl unless embeddingApiKey is
explicitly set, in which case the embedding path assumes the provider's
default endpoint. If neither override is set and QueryRouter falls back to
OPENROUTER_API_KEY, it automatically uses the OpenRouter compatibility
URL for embeddings as well.
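The key-fallback order described above can be summarized in code. This helper is illustrative only (the name, options shape, and injected env are not part of the API); it just restates the documented precedence embeddingApiKey, then apiKey, then OPENAI_API_KEY, then OPENROUTER_API_KEY:

```typescript
// Illustrative helper, not part of the QueryRouter API: resolves which
// key the embedding path would use, following the documented fallback
// chain. `env` is injected so the sketch stays self-contained.
function resolveEmbeddingKey(opts: {
  apiKey?: string;
  embeddingApiKey?: string;
  env?: { OPENAI_API_KEY?: string; OPENROUTER_API_KEY?: string };
}): string | undefined {
  const env = opts.env ?? {};
  return (
    opts.embeddingApiKey ?? // explicit embedding override wins
    opts.apiKey ??          // then the shared apiKey override
    env.OPENAI_API_KEY ??   // then the direct OpenAI env key
    env.OPENROUTER_API_KEY  // last resort: OpenRouter env key
  );
}
```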
Optional github…: Configuration for background GitHub repository indexing.
When provided, the router will asynchronously index GitHub repos after
init() completes and merge the resulting chunks into the corpus.
Optional strategy: Retrieval strategy configuration for the HyDE-aware query router.
Controls how the classifier selects between the none, simple, moderate
(HyDE), and complex (HyDE + decompose) retrieval strategies.
Type: QueryRouterStrategyConfig
Optional include…: Load bundled platform knowledge (tools, skills, FAQ, API reference,
troubleshooting) into the corpus during init().
When enabled, the router ships with built-in knowledge of every AgentOS capability, so no external docs are required for platform questions. Default: true
Public constructor configuration for the QueryRouter pipeline.
knowledgeCorpus is required. All other fields are optional and default to the values in DEFAULT_QUERY_ROUTER_CONFIG.
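A minimal configuration sketch, using only field names that appear elsewhere in this reference; the interface below is reconstructed for illustration, not imported from the real package, and the key strings are placeholders:

```typescript
// Reconstructed subset of the config for illustration. Only
// knowledgeCorpus is required; everything else falls back to
// DEFAULT_QUERY_ROUTER_CONFIG.
interface QueryRouterConfigSketch {
  knowledgeCorpus: string[]; // directories of .md / .mdx files
  apiKey?: string;           // classifier + generator LLM calls
  embeddingApiKey?: string;  // embeddings only
  embeddingBaseUrl?: string; // embeddings only
}

const config: QueryRouterConfigSketch = {
  knowledgeCorpus: ["./docs", "./guides"],
  // Generation via an OpenAI-compatible endpoint, embeddings on a
  // direct OpenAI key, as the embeddingApiKey entry suggests.
  apiKey: "YOUR_OPENROUTER_KEY",        // placeholder
  embeddingApiKey: "YOUR_OPENAI_KEY",   // placeholder
};
```

With this split, generation traffic can flow through an OpenRouter-style endpoint while the embedding path stays on the provider's default endpoint, matching the embeddingApiKey / embeddingBaseUrl behavior described above.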