Class QueryGenerator

Builds tier-appropriate prompts and generates LLM answers.

The generator selects a model (standard or deep) based on the query tier, constructs a system prompt from optional documentation context and a research synthesis, and delegates the actual LLM call to generateText.
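The tier-to-model routing described above can be sketched as follows. This is a hypothetical illustration, not the library's actual code; the helper selectModel and the assumption that only the deepest tier routes to modelDeep are mine.

```typescript
type QueryTier = 0 | 1 | 2 | 3;

interface GeneratorOptions {
  model: string;     // standard model id
  modelDeep: string; // deep model id
  provider: string;
}

// Assumption: only the highest tier (3) routes to the deep model;
// all other tiers use the standard model.
function selectModel(opts: GeneratorOptions, tier: QueryTier): string {
  return tier === 3 ? opts.modelDeep : opts.model;
}

const opts: GeneratorOptions = {
  model: 'openai:gpt-4.1-mini',
  modelDeep: 'openai:gpt-4.1',
  provider: 'openai',
};

console.log(selectModel(opts, 1)); // standard model id
console.log(selectModel(opts, 3)); // deep model id
```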

Example

const gen = new QueryGenerator({
  model: 'openai:gpt-4.1-mini',
  modelDeep: 'openai:gpt-4.1',
  provider: 'openai',
});

const result = await gen.generate('How does auth work?', 1, chunks);
console.log(result.answer);

Methods

  • generate — Generate an answer for the given query at the specified complexity tier.

    Parameters

    • query: string

      The user's original question.

    • tier: QueryTier

      The classified complexity tier (0–3).

    • chunks: RetrievedChunk[]

      Retrieved documentation chunks, sorted by relevance.

    • Optional researchSynthesis: string

      Research narrative, supplied only for tier-3 queries.

    Returns Promise<GenerateResult>

    A promise resolving to the generated answer, model used, and token usage.
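The prompt-assembly step that generate performs from chunks and the optional research synthesis might look like the sketch below. The RetrievedChunk shape, the buildSystemPrompt helper, and the prompt wording are all assumptions for illustration, not the library's implementation.

```typescript
// Assumed shape of a retrieved chunk; the real interface may carry more fields.
interface RetrievedChunk {
  text: string;
  score: number;
}

// Hypothetical prompt assembly: number each chunk, then append the
// research synthesis section only when one is provided (tier 3).
function buildSystemPrompt(
  chunks: RetrievedChunk[],
  researchSynthesis?: string,
): string {
  const context = chunks.map((c, i) => `[${i + 1}] ${c.text}`).join('\n');
  let prompt = `Answer using the documentation below.\n\n${context}`;
  if (researchSynthesis) {
    prompt += `\n\nResearch synthesis:\n${researchSynthesis}`;
  }
  return prompt;
}

const chunks: RetrievedChunk[] = [
  { text: 'Auth uses JWT tokens.', score: 0.92 },
  { text: 'Tokens expire after one hour.', score: 0.81 },
];

console.log(buildSystemPrompt(chunks));                    // context only
console.log(buildSystemPrompt(chunks, 'Deep findings…'));  // context + synthesis
```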