# Providers

How CrabTalk routes model requests — API standards, provider config, and supported providers.

CrabTalk supports multiple LLM providers through a unified `Model` trait. All providers are API-based — configure an endpoint, point the daemon at it, and go. Use `crabtalk auth` for interactive setup, or edit `crab.toml` directly.
## API standards

Each provider uses a wire format selected by the `standard` field:

| Standard | Protocol | Used by |
|---|---|---|
| `openai_compat` (default) | OpenAI chat completions API | OpenAI, DeepSeek, Grok, Qwen, Kimi, and any compatible endpoint |
| `anthropic` | Anthropic Messages API | Claude |
| `google` | Google Gemini API | Gemini |
| `ollama` | Ollama local API | Ollama |
| `azure` | Azure OpenAI API | Azure OpenAI |
| `bedrock` | AWS Bedrock API | AWS Bedrock |

If `standard` is omitted, CrabTalk defaults to `openai_compat`. If the `base_url` contains `"anthropic"`, the Anthropic standard is auto-detected.
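As an illustration of these two rules, both provider sections below resolve to the Anthropic wire format. The section names and the endpoint URL are examples, not required values:

```toml
# Explicit: the standard field selects the wire format directly.
[provider.claude-explicit]
models = ["claude-sonnet-4-20250514"]
api_key = "sk-ant-..."
standard = "anthropic"

# Implicit: standard is omitted, but the base_url contains "anthropic",
# so the Anthropic standard is auto-detected.
[provider.claude-implicit]
models = ["claude-opus-4-6"]
api_key = "sk-ant-..."
base_url = "https://api.anthropic.com/v1"
```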
## Provider configuration

Each provider is a `[provider.<name>]` section in `crab.toml`. A provider owns one or more models:

```toml
[provider.deepseek]
models = ["deepseek-chat", "deepseek-thinking"]
api_key = "sk-..."
```

Model names must be unique across all providers.
## Selecting the active model

Set the default model in `[system.crab]`:

```toml
[system.crab]
model = "deepseek-chat"
```

Override per agent in `[agents.*]`:

```toml
[agents.researcher]
model = "claude-opus-4-6"
```

## Supported providers
### OpenAI

```toml
[provider.openai]
models = ["gpt-4o", "gpt-4o-mini"]
api_key = "sk-..."
```

Uses the OpenAI chat completions API (the default standard). Supports GPT-4o, GPT-4o-mini, the o-series, and all other OpenAI models.
### Anthropic Claude

```toml
[provider.anthropic]
models = ["claude-sonnet-4-20250514", "claude-opus-4-6"]
api_key = "sk-ant-..."
standard = "anthropic"
```

Uses the Anthropic Messages API. Set `standard = "anthropic"` explicitly, or it will be auto-detected if your `base_url` contains `"anthropic"`.
### DeepSeek

```toml
[provider.deepseek]
models = ["deepseek-chat", "deepseek-thinking"]
api_key = "sk-..."
```

OpenAI-compatible API. Supports `deepseek-chat` and `deepseek-thinking`.
### Google Gemini

```toml
[provider.google]
models = ["gemini-2.5-pro"]
api_key = "..."
standard = "google"
```

### Grok

```toml
[provider.grok]
models = ["grok-3"]
api_key = "..."
base_url = "https://api.x.ai/v1"
```

OpenAI-compatible API. Requires an explicit `base_url`.
### Qwen

```toml
[provider.qwen]
models = ["qwen-plus"]
api_key = "..."
base_url = "https://dashscope.aliyuncs.com/compatible-mode/v1"
```

OpenAI-compatible API via DashScope.
### Kimi

```toml
[provider.kimi]
models = ["kimi-latest"]
api_key = "..."
base_url = "https://api.moonshot.cn/v1"
```

OpenAI-compatible API by Moonshot AI.
### Ollama

```toml
[provider.ollama]
models = ["llama3.1"]
standard = "ollama"
base_url = "http://localhost:11434/v1"
```

No API key needed.
### Azure OpenAI

```toml
[provider.azure]
models = ["gpt-4o"]
api_key = "..."
standard = "azure"
api_version = "2024-02-01"
base_url = "https://your-resource.openai.azure.com"
```

### AWS Bedrock

```toml
[provider.bedrock]
models = ["anthropic.claude-3-5-sonnet"]
standard = "bedrock"
region = "us-east-1"
access_key = "..."
secret_key = "..."
```

### Custom OpenAI-compatible endpoints
Any OpenAI-compatible API works with a `base_url`:

```toml
[provider.my-provider]
models = ["my-model"]
api_key = "..."
base_url = "https://my-endpoint.com/v1"
```

## Provider manager
The `ProviderManager` holds all configured providers and routes requests by model name. It supports hot-reload — update the config and the active provider changes without restarting the daemon.
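The routing idea can be sketched in a few lines of Rust. This is an illustrative sketch only — the names `Provider`, `ProviderManager`, `register`, and `route` are assumptions, not CrabTalk's actual API. Because model names are unique across providers, a single flat map resolves any model name to its owning provider:

```rust
use std::collections::HashMap;

// Hypothetical stand-in for a configured [provider.<name>] section.
#[derive(Debug)]
struct Provider {
    name: String,
}

// Sketch of a manager that routes requests by model name.
#[derive(Default)]
struct ProviderManager {
    by_model: HashMap<String, Provider>,
}

impl ProviderManager {
    // Register a provider and the models it owns. A duplicate model name
    // is rejected, since names must be unique across all providers.
    fn register(&mut self, provider: &str, models: &[&str]) -> Result<(), String> {
        for m in models {
            if self.by_model.contains_key(*m) {
                return Err(format!("duplicate model name: {m}"));
            }
            self.by_model
                .insert((*m).to_string(), Provider { name: provider.to_string() });
        }
        Ok(())
    }

    // Resolve a model name to its owning provider.
    fn route(&self, model: &str) -> Option<&Provider> {
        self.by_model.get(model)
    }
}

fn main() {
    let mut mgr = ProviderManager::default();
    mgr.register("deepseek", &["deepseek-chat", "deepseek-thinking"]).unwrap();
    mgr.register("anthropic", &["claude-opus-4-6"]).unwrap();

    assert_eq!(mgr.route("deepseek-chat").unwrap().name, "deepseek");
    assert!(mgr.route("unknown-model").is_none());
    // Re-registering an existing model name is rejected.
    assert!(mgr.register("other", &["deepseek-chat"]).is_err());
    println!("routing ok");
}
```

Hot-reload then amounts to rebuilding this map from the new config and swapping it in; in-flight requests keep their old provider handle while new requests see the updated table.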
## What's next

- Configuration — full config setup
- Commands — telegram, search, and custom commands
- Auth — manage providers interactively