CrabTalk

Free Provider Keys

Get free LLM API keys to use with CrabDash — no credit card required.

CrabDash routes requests to cloud providers using your own API keys. Keys are stored in the macOS Keychain — they never leave your machine.

Every provider below offers genuinely free API access — no credit card, no expiring trial credits. All are OpenAI-compatible, so they work with CrabDash out of the box.

Google Gemini

The easiest way to start. No billing required.

  1. Go to aistudio.google.com/apikey
  2. Click Create API key
  3. Add it in CrabDash → Settings → Providers → Google

Free tier includes Gemini 2.0 Flash and Gemini 1.5 Pro at roughly 15 requests per minute (RPM). No per-token billing. Good enough for development and personal use.
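If you want to confirm the key works before (or outside of) CrabDash, you can call Google's OpenAI-compatible endpoint directly. A sketch — the `GEMINI_API_KEY` variable name is our convention, and the model id may change over time:

```shell
# Smoke-test a Gemini key via Google's OpenAI-compatible endpoint.
# Assumes GEMINI_API_KEY is exported; skips quietly if it isn't.
ENDPOINT="https://generativelanguage.googleapis.com/v1beta/openai/chat/completions"
if [ -n "$GEMINI_API_KEY" ]; then
  curl -s "$ENDPOINT" \
    -H "Authorization: Bearer $GEMINI_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model":"gemini-2.0-flash","messages":[{"role":"user","content":"ping"}]}'
fi
```

A valid key returns a JSON chat completion; a bad key returns a JSON error body.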

Groq

Fast inference on open models.

  1. Go to console.groq.com/keys
  2. Click Create API Key
  3. Add it in CrabDash → Settings → Providers → Groq

Free tier includes Llama 3, Mixtral, and Gemma. ~30 RPM, ~14,400 requests/day.
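Same pattern for Groq — its OpenAI-compatible API lives under `/openai/v1`. A sketch; `GROQ_API_KEY` is our naming convention and the model id below may rotate out of the catalog:

```shell
# Smoke-test a Groq key via its OpenAI-compatible endpoint.
# Assumes GROQ_API_KEY is exported; skips quietly if it isn't.
ENDPOINT="https://api.groq.com/openai/v1/chat/completions"
if [ -n "$GROQ_API_KEY" ]; then
  curl -s "$ENDPOINT" \
    -H "Authorization: Bearer $GROQ_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model":"llama-3.1-8b-instant","messages":[{"role":"user","content":"ping"}]}'
fi
```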

NVIDIA NIM

Largest free model catalog — 100+ models.

  1. Go to build.nvidia.com and join the Developer Program
  2. Generate an API key
  3. Add it in CrabDash → Settings → Providers → NVIDIA

Free tier includes Llama 4, DeepSeek R1/V3, Qwen3, Mistral, and many more. ~40 RPM per model.
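NVIDIA's hosted endpoint is also OpenAI-compatible; models are namespaced by vendor (e.g. `meta/…`). A sketch — `NVIDIA_API_KEY` is our naming convention:

```shell
# Smoke-test an NVIDIA NIM key against the hosted inference endpoint.
# Assumes NVIDIA_API_KEY is exported; skips quietly if it isn't.
ENDPOINT="https://integrate.api.nvidia.com/v1/chat/completions"
if [ -n "$NVIDIA_API_KEY" ]; then
  curl -s "$ENDPOINT" \
    -H "Authorization: Bearer $NVIDIA_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model":"meta/llama-3.1-8b-instruct","messages":[{"role":"user","content":"ping"}]}'
fi
```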

Cloudflare Workers AI

Edge inference in 300+ cities.

  1. Create a free account at dash.cloudflare.com
  2. Go to Workers & Pages → AI and create an API token
  3. Add it in CrabDash → Settings → Providers → Cloudflare

Free tier includes 50+ models (Llama 3.2 variants, Mistral 7B, others), metered at 10,000 Neurons (Cloudflare's compute unit) per day.
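Cloudflare's OpenAI-compatible route is the odd one out: it is scoped to your account ID, and model ids carry a `@cf/` prefix. A sketch under those assumptions — `CF_API_TOKEN` and `CF_ACCOUNT_ID` are our naming conventions:

```shell
# Smoke-test a Workers AI token via the account-scoped OpenAI-compatible route.
# Assumes CF_API_TOKEN and CF_ACCOUNT_ID are exported; skips quietly otherwise.
ENDPOINT="https://api.cloudflare.com/client/v4/accounts/${CF_ACCOUNT_ID}/ai/v1/chat/completions"
if [ -n "$CF_API_TOKEN" ] && [ -n "$CF_ACCOUNT_ID" ]; then
  curl -s "$ENDPOINT" \
    -H "Authorization: Bearer $CF_API_TOKEN" \
    -H "Content-Type: application/json" \
    -d '{"model":"@cf/meta/llama-3.1-8b-instruct","messages":[{"role":"user","content":"ping"}]}'
fi
```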

Mistral

All Mistral models on the free Experiment plan.

  1. Go to console.mistral.ai
  2. Create an account (defaults to Experiment plan) and generate an API key
  3. Add it in CrabDash → Settings → Providers → Mistral

Free tier includes Mistral Large, Small, Codestral, and Pixtral. 1B tokens/month, but a low rate limit (1–2 RPM). Note that requests on the Experiment plan may be used for model training.
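Mistral's API follows the same OpenAI-compatible shape. A sketch — `MISTRAL_API_KEY` is our naming convention, and `mistral-small-latest` is an alias that tracks the current small model:

```shell
# Smoke-test a Mistral key (Experiment plan keys work the same way).
# Assumes MISTRAL_API_KEY is exported; skips quietly if it isn't.
ENDPOINT="https://api.mistral.ai/v1/chat/completions"
if [ -n "$MISTRAL_API_KEY" ]; then
  curl -s "$ENDPOINT" \
    -H "Authorization: Bearer $MISTRAL_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model":"mistral-small-latest","messages":[{"role":"user","content":"ping"}]}'
fi
```

Keep the Experiment plan's 1–2 RPM limit in mind: even a quick test loop will hit 429s.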

Running without cloud providers

You don't need any API key to use CrabDash with local models on Apple Silicon. See Local Models.
