
Providers

Provider classes — re-exported from PydanticAI under the public API. A Provider handles authentication, the endpoint, and the HTTP client used to talk to an LLM vendor. Attach a Provider to a Model, then pass that Model to Agent.model. There's no separate Agent(provider=...) field by design.

from murmur.models import OpenRouterModel
from murmur.providers import OpenRouterProvider

model = OpenRouterModel(
    "anthropic/claude-sonnet-4-5",
    provider=OpenRouterProvider(api_key="sk-or-..."),
)
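
That model then goes straight into the agent. A minimal sketch, assuming Agent is importable from the package root and eliding its other arguments:

from murmur import Agent  # assumed import path

agent = Agent(
    name="researcher",
    model=model,  # the OpenRouterModel built above
)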

See the Models & providers concept guide for when to use the Provider form vs. the string form, and the upstream PydanticAI per-vendor docs for each Provider's constructor knobs (api_key, base_url, http_client, custom auth, etc.).

Available providers

| Class | Vendor / use case |
| --- | --- |
| AnthropicProvider | Anthropic API or Anthropic-compatible endpoint |
| AzureProvider | Azure OpenAI Service |
| BedrockProvider | AWS Bedrock-hosted models |
| CerebrasProvider | Cerebras inference |
| CohereProvider | Cohere API |
| GoogleProvider | Gemini API — covers both Google AI Studio and Vertex AI (set vertexai=True on the constructor; see the sketch after this table) |
| GroqProvider | Groq API |
| HuggingFaceProvider | HuggingFace Inference Providers |
| LiteLLMProvider | LiteLLM proxy (use for any vendor LiteLLM routes) |
| MistralProvider | Mistral API |
| OllamaProvider | Ollama local or remote endpoint |
| OpenAIProvider | OpenAI API or OpenAI-compatible endpoint (base_url=...) |
| OpenRouterProvider | OpenRouter API |
| XaiProvider | xAI (Grok) API |
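
For example, the Vertex AI switch on GoogleProvider is flipped on the constructor. A minimal sketch follows; the GoogleModel class name and the model id follow the OpenRouterModel pattern and are assumptions, not confirmed exports.

from murmur.models import GoogleModel  # assumed export, mirroring OpenRouterModel
from murmur.providers import GoogleProvider

# vertexai=True routes requests through Vertex AI instead of Google AI Studio;
# credentials are then resolved the way Vertex AI expects (e.g. application-default credentials).
model = GoogleModel(
    "gemini-2.0-flash",  # placeholder model id
    provider=GoogleProvider(vertexai=True),
)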

A Provider handles authentication and the HTTP client used to talk to an LLM vendor. For most agents, the default Provider attached by PydanticAI's string-form model resolution ("openai:gpt-5.2" etc.) is enough — credentials flow from the standard env vars (OPENAI_API_KEY, ANTHROPIC_API_KEY, OPENROUTER_API_KEY, …).
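
A minimal sketch of that default path, assuming Agent is importable from the package root and that other required Agent arguments are elided:

from murmur import Agent  # assumed import path

# OPENAI_API_KEY is read from the environment; no Provider object is needed.
agent = Agent(
    name="researcher",
    model="openai:gpt-5.2",
)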

Use a Provider explicitly when you need:

  • An alternative endpoint — Azure OpenAI, Bedrock-hosted Anthropic, Vertex Gemini, a private gateway, or an Ollama instance on another host.
  • Custom auth — Azure AD, IAM-signed requests, bearer tokens.
  • A custom HTTP client — shared connection pool, custom timeouts, mTLS, outbound proxy (see the sketch after this list).
  • A non-default base URL behind one of the OpenAI-compatible vendors — LiteLLM, Together, Fireworks, etc. (Use OpenAIProvider with base_url=..., or LiteLLMProvider for the LiteLLM proxy.)

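The custom-HTTP-client case above might look like the following. A minimal sketch: the AnthropicModel class name is assumed by analogy with OpenRouterModel, and the model id and timeout value are illustrative.

import httpx

from murmur.models import AnthropicModel  # assumed export, mirroring OpenRouterModel
from murmur.providers import AnthropicProvider

# One shared client gives every agent the same connection pool and timeouts;
# an outbound proxy or mTLS would be configured on this client as well.
http_client = httpx.AsyncClient(timeout=httpx.Timeout(30.0))

model = AnthropicModel(
    "claude-sonnet-4-5",  # placeholder model id
    provider=AnthropicProvider(api_key="sk-ant-...", http_client=http_client),
)
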
And the OpenRouter example from the top of the page, attached to an agent in full:

from murmur.models import OpenRouterModel
from murmur.providers import OpenRouterProvider

Agent(
    name="researcher",
    model=OpenRouterModel(
        "anthropic/claude-sonnet-4-5",
        provider=OpenRouterProvider(api_key="sk-or-..."),
    ),
    ...,
)

Re-exporting these here keeps the Public API Rule (CLAUDE.md §2) intact — user code never imports from pydantic_ai directly. PydanticAI ships additional thin OpenAI-compatible Provider shims (Together, Fireworks, DeepSeek, Vercel, Heroku, etc.); for those, use OpenAIProvider with a custom base_url, or LiteLLMProvider for any vendor LiteLLM already routes.
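
For one of those OpenAI-compatible shims, that might look like the following. A minimal sketch: the OpenAIModel class name, the model id, and the base URL are placeholders rather than confirmed values.

from murmur.models import OpenAIModel  # assumed export, mirroring OpenRouterModel
from murmur.providers import OpenAIProvider

# base_url points the OpenAI-compatible client at the vendor's endpoint.
model = OpenAIModel(
    "meta-llama/Llama-3.3-70B-Instruct-Turbo",  # placeholder model id
    provider=OpenAIProvider(
        base_url="https://api.together.xyz/v1",  # placeholder base URL
        api_key="...",
    ),
)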