# OpenRouter
OpenRouter provides a unified API that routes requests to many models behind a single endpoint and API key. It is OpenAI-compatible, so most OpenAI SDKs work by switching the base URL.

## CLI setup
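A minimal setup sketch. The `OPENROUTER_API_KEY` variable name is an assumption; `openclaw models set` is the command referenced in the notes below, and `<provider>/<model>` is a placeholder you fill in yourself:

```shell
# Export your OpenRouter API key (assumed variable name).
export OPENROUTER_API_KEY="sk-or-..."

# Switch from the openrouter/auto onboarding default to a concrete model.
openclaw models set openrouter/<provider>/<model>
```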
## Config snippet
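The exact config shape is not shown on this page; the snippet below is a hypothetical illustration only. The `providers.openrouter` block and the `apiKey`/`baseUrl` key names are assumptions; the model ref format and the base URL come from the notes below:

```json
{
  "model": "openrouter/auto",
  "providers": {
    "openrouter": {
      "apiKey": "sk-or-...",
      "baseUrl": "https://openrouter.ai/api/v1"
    }
  }
}
```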
## Notes
- Model refs are `openrouter/<provider>/<model>`.
- Onboarding defaults to `openrouter/auto`. Switch to a concrete model later with `openclaw models set openrouter/<provider>/<model>`.
- For more model/provider options, see /concepts/model-providers.
- OpenRouter uses a Bearer token with your API key under the hood.
- On real OpenRouter requests (`https://openrouter.ai/api/v1`), OpenClaw also adds OpenRouter's documented app-attribution headers: `HTTP-Referer: https://openclaw.ai`, `X-OpenRouter-Title: OpenClaw`, and `X-OpenRouter-Categories: cli-agent`.
- On verified OpenRouter routes, Anthropic model refs also keep the OpenRouter-specific Anthropic `cache_control` markers that OpenClaw uses for better prompt-cache reuse on system/developer prompt blocks.
- If you repoint the OpenRouter provider at some other proxy or base URL, OpenClaw does not inject those OpenRouter-specific headers or Anthropic cache markers.
- OpenRouter still runs through the proxy-style OpenAI-compatible path, so native OpenAI-only request shaping (such as `serviceTier`, Responses `store`, OpenAI reasoning-compat payloads, and prompt-cache hints) is not forwarded.
- Gemini-backed OpenRouter refs stay on the proxy-Gemini path: OpenClaw keeps Gemini thought-signature sanitation there, but does not enable native Gemini replay validation or bootstrap rewrites.
- On supported non-`auto` routes, OpenClaw maps the selected thinking level to OpenRouter proxy reasoning payloads. Unsupported model hints and `openrouter/auto` skip that reasoning injection.
- If you pass OpenRouter provider routing under model params, OpenClaw forwards it as OpenRouter routing metadata before the shared stream wrappers run.
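The header behavior described above can be sketched as follows. This is an illustrative model, not OpenClaw's actual source: attribution headers are injected only when the provider still points at the real OpenRouter endpoint, and the `build_headers` function name is an assumption.

```python
# Base URL for real OpenRouter requests (from the notes above).
OPENROUTER_BASE = "https://openrouter.ai/api/v1"

def build_headers(api_key: str, base_url: str) -> dict:
    """Sketch: Bearer auth always; attribution headers only on real OpenRouter."""
    headers = {"Authorization": f"Bearer {api_key}"}
    if base_url.rstrip("/") == OPENROUTER_BASE:
        # App-attribution headers are skipped when the provider is
        # repointed at some other proxy/base URL.
        headers.update({
            "HTTP-Referer": "https://openclaw.ai",
            "X-OpenRouter-Title": "OpenClaw",
            "X-OpenRouter-Categories": "cli-agent",
        })
    return headers
```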
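The thinking-level mapping can likewise be sketched. This is an assumed shape, not OpenClaw internals: the `{"reasoning": {"effort": ...}}` payload follows OpenRouter's documented reasoning parameter, while the function name, level names, and `supported` flag are illustrative.

```python
# Assumed mapping from OpenClaw thinking levels to OpenRouter effort values.
LEVEL_TO_EFFORT = {"low": "low", "medium": "medium", "high": "high"}

def reasoning_payload(model_ref: str, thinking_level, supported: bool):
    """Sketch: openrouter/auto and unsupported model hints skip injection."""
    if model_ref == "openrouter/auto" or not supported or thinking_level is None:
        return None  # no reasoning injection on these routes
    effort = LEVEL_TO_EFFORT.get(thinking_level)
    return {"reasoning": {"effort": effort}} if effort else None
```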