Class RetrievalAugmentor

Orchestrates the RAG pipeline including ingestion, retrieval, and document management.

Implements

Constructors

Methods

  • Register a reranker provider with the RerankerService.

    Call this after initialization to add reranker providers (e.g., CohereReranker, LocalCrossEncoderReranker) that will be available for reranking operations.

    Parameters

    • provider: IRerankerProvider

      A reranker provider instance implementing IRerankerProvider

    Returns void

    Throws

    If RerankerService is not configured

    Example

    import { CohereReranker, LocalCrossEncoderReranker } from '@framers/agentos/rag/reranking';

    // After initialization
    augmentor.registerRerankerProvider(new CohereReranker({
      providerId: 'cohere',
      apiKey: process.env.COHERE_API_KEY!
    }));

    augmentor.registerRerankerProvider(new LocalCrossEncoderReranker({
      providerId: 'local',
      defaultModelId: 'cross-encoder/ms-marco-MiniLM-L-6-v2'
    }));
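    Conceptually, a reranker scores each (query, document) pair with a stronger model and reorders the retrieved set by that score. The sketch below is illustrative only: `overlapScore` is a toy word-overlap stand-in for a real cross-encoder or Cohere scoring model, and neither function is part of this API.

```typescript
// Reorder documents by descending relevance score for the query.
// The scoring function is pluggable; a real provider would call a model.
function rerank(
  query: string,
  docs: string[],
  score: (q: string, d: string) => number,
): string[] {
  return [...docs].sort((a, b) => score(query, b) - score(query, a));
}

// Toy scorer: counts how many query words appear in the document.
// A real reranker replaces this with a learned relevance model.
function overlapScore(q: string, d: string): number {
  const qWords = new Set(q.toLowerCase().split(/\s+/));
  return d.toLowerCase().split(/\s+/).filter((w) => qWords.has(w)).length;
}
```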
  • Register an LLM caller for HyDE hypothesis generation.

    HyDE (Hypothetical Document Embedding) improves retrieval quality by generating a hypothetical answer first, then embedding that answer instead of the raw query. The hypothesis is semantically closer to the stored documents, yielding better vector similarity matches.

    The caller must be set before HyDE-enabled retrieval can be used. Once set, HyDE can be activated per-request via options.hyde.enabled on retrieveContext, or globally by supplying a default HyDE config.

    Parameters

    • llmCaller: HydeLlmCaller

      An async function that takes (systemPrompt, userPrompt) and returns the LLM completion text. The system prompt contains instructions for hypothesis generation; the user prompt is the query.

    Returns void

    Example

    augmentor.setHydeLlmCaller(async (systemPrompt, userPrompt) => {
      const response = await openai.chat.completions.create({
        model: 'gpt-4o-mini',
        messages: [
          { role: 'system', content: systemPrompt },
          { role: 'user', content: userPrompt },
        ],
        max_tokens: 200,
      });
      return response.choices[0].message.content ?? '';
    });
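    The HyDE flow described above (generate a hypothesis, embed it, search with that embedding) can be sketched end to end. This is a conceptual illustration, not this class's internals: `embed`, `vectorSearch`, and `hydeRetrieve` are illustrative stand-ins supplied by the caller here, not part of this API.

```typescript
type HydeLlmCaller = (systemPrompt: string, userPrompt: string) => Promise<string>;

// Sketch of the HyDE retrieval steps: hypothesize, embed, then search.
async function hydeRetrieve(
  query: string,
  llmCaller: HydeLlmCaller,
  embed: (text: string) => Promise<number[]>,
  vectorSearch: (vector: number[]) => Promise<string[]>,
): Promise<string[]> {
  // 1. Generate a hypothetical answer to the query.
  const hypothesis = await llmCaller(
    'Write a short passage that plausibly answers the user question.',
    query,
  );
  // 2. Embed the hypothesis instead of the raw query — it is
  //    semantically closer to the stored documents.
  const vector = await embed(hypothesis);
  // 3. Run vector search with the hypothesis embedding.
  return vectorSearch(vector);
}
```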
  • Parameters

    • documentIds: string[]
    • Optional dataSourceId: string
    • Optional options: {
          ignoreNotFound?: boolean;
      }
      • Optional ignoreNotFound?: boolean

    Returns Promise<{
        successCount: number;
        failureCount: number;
        errors?: {
            documentId: string;
            message: string;
            details?: any;
        }[];
    }>

    Inherit Doc

Properties

augmenterId: string