rageval - v0.1.1

    Interface LlmProvider

    Unified provider interface — both Anthropic and OpenAI adapters implement this.

    You can also implement this interface for custom providers (e.g. Gemini, Ollama):

    import type { LlmProvider } from 'rageval'

    const myProvider: LlmProvider = {
      name: 'my-provider',
      model: 'my-model',
      async complete(prompt) {
        // call your LLM and return the text response
        return myLlm.generate(prompt)
      },
    }
    interface LlmProvider {
        name: string;
        model: string;
        complete(prompt: string): Promise<string>;
    }

    Properties

    name: string

    Provider name — e.g. 'anthropic', 'openai'. Appears in EvaluationResult.meta.

    model: string

    Model identifier — e.g. 'claude-opus-4-6'. Appears in EvaluationResult.meta.

    Methods

    • complete — Send a prompt to the LLM and return the text response.

      Parameters

      • prompt: string

        The full prompt string to send.

      Returns Promise<string>

      The LLM's text response.
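
    A minimal sketch of implementing and exercising complete, assuming only the interface shape shown above — the echo-style stub below stands in for a real LLM client and is not part of rageval:

    ```typescript
    // Interface shape as documented above.
    interface LlmProvider {
      name: string;
      model: string;
      complete(prompt: string): Promise<string>;
    }

    // Hypothetical stub provider: a real adapter would call an LLM API here.
    const echoProvider: LlmProvider = {
      name: 'echo',
      model: 'echo-1',
      async complete(prompt: string): Promise<string> {
        // Echo the prompt back instead of calling a model.
        return `echo: ${prompt}`;
      },
    };

    async function main(): Promise<void> {
      const reply = await echoProvider.complete('ping');
      console.log(reply); // prints "echo: ping"
    }

    main();
    ```

    Because complete returns a Promise<string>, an adapter can wrap any async client (HTTP, SDK, local runtime) as long as it resolves to the response text.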