Configuration for the OpenAI provider.

Uses structural typing for the client, so you can pass any compatible object: it works with Azure OpenAI, proxies, or mocks in tests.

client
  An OpenAI client instance from the openai package.

model (optional)
  OpenAI model to use for judging.

max (optional)
  Maximum tokens in the judge's response.

temperature (optional)
  Sampling temperature for the judge LLM. Set to 0 for reproducible, deterministic evaluation runs. Leave undefined to use the provider's default.

retries (optional)
  Number of retry attempts for transient errors (rate limits, 5xx).
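A minimal sketch of how this configuration might be typed and used. The interface name `OpenAIProviderConfig` and the `JudgeClient` structural type are assumptions for illustration; only the field names and the structural-typing behavior come from the reference above.

```typescript
// Hypothetical names: OpenAIProviderConfig and JudgeClient are illustrative,
// reconstructed from the property list above rather than taken from the library.
interface JudgeClient {
  chat: {
    completions: {
      create: (params: Record<string, unknown>) => Promise<unknown>;
    };
  };
}

interface OpenAIProviderConfig {
  client: JudgeClient;   // structural typing: any shape-compatible object works
  model?: string;        // OpenAI model to use for judging
  max?: number;          // maximum tokens in the judge's response
  temperature?: number;  // 0 for deterministic runs; undefined uses the provider default
  retries?: number;      // retry attempts for transient errors (rate limits, 5xx)
}

// Because the client is typed structurally, a mock satisfies it in tests
// just as an Azure OpenAI client or a proxy wrapper would.
const mockClient: JudgeClient = {
  chat: { completions: { create: async () => ({ choices: [] }) } },
};

const config: OpenAIProviderConfig = {
  client: mockClient,
  model: "gpt-4o-mini",
  temperature: 0, // reproducible evaluation runs
  retries: 3,
};
```

The same shape accepts a real `openai` package client in place of `mockClient`, since only the structure of the object is checked, not its class.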