Interface LoopContext

Execution context provided to the LoopController by the caller. Abstracts away the underlying LLM/GMI implementation so the loop logic remains provider-agnostic.

interface LoopContext {
    generateStream: (() => AsyncGenerator<LoopChunk, LoopOutput, undefined>);
    executeTool: ((toolCall) => Promise<LoopToolCallResult>);
    addToolResults: ((results) => void);
}

Properties

generateStream: (() => AsyncGenerator<LoopChunk, LoopOutput, undefined>)

Async generator that streams chunks during a single LLM inference pass. Must return a LoopOutput as its generator return value (surfaced as the value of the final done: true result from .next()).

Type declaration

    • (): AsyncGenerator<LoopChunk, LoopOutput, undefined>
    • Returns AsyncGenerator<LoopChunk, LoopOutput, undefined>

executeTool: ((toolCall) => Promise<LoopToolCallResult>)

Execute a single tool call and return its result. Implementations should never throw — instead return a result with success: false and a populated error field.

Type declaration

    • (toolCall): Promise<LoopToolCallResult>
    • Returns Promise<LoopToolCallResult>

addToolResults: ((results) => void)

Feed tool results back into the conversation so the next generateStream call has access to them. Typically appends tool messages to the message list.

Type declaration

    • (results): void
    • Returns void