# CLI Setup Reference
This page is the full reference for `openclaw onboard`.
For the short guide, see Onboarding (CLI).
## What the wizard does
Local mode (default) walks you through:
- Model and auth setup (OpenAI Code subscription OAuth, Anthropic Claude CLI or API key, plus MiniMax, GLM, Ollama, Moonshot, StepFun, and AI Gateway options)
- Workspace location and bootstrap files
- Gateway settings (port, bind, auth, tailscale)
- Channels and providers (Telegram, WhatsApp, Discord, Google Chat, Mattermost, Signal, BlueBubbles, and other bundled channel plugins)
- Daemon install (LaunchAgent, systemd user unit, or native Windows Scheduled Task with Startup-folder fallback)
- Health check
- Skills setup
## Local flow details
### Existing config detection
- If `~/.openclaw/openclaw.json` exists, choose Keep, Modify, or Reset.
- Re-running the wizard does not wipe anything unless you explicitly choose Reset (or pass `--reset`).
- CLI `--reset` defaults to config + credentials + sessions; use `--reset-scope full` to also remove the workspace.
- If config is invalid or contains legacy keys, the wizard stops and asks you to run `openclaw doctor` before continuing.
- Reset uses `trash` and offers scopes:
  - Config only
  - Config + credentials + sessions
  - Full reset (also removes workspace)
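As a sketch, the non-interactive equivalents of those reset choices use the flags documented above:

```shell
# Reset config + credentials + sessions (the --reset default), then re-run setup.
openclaw onboard --reset

# Full reset: also removes the workspace directory.
openclaw onboard --reset --reset-scope full
```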
### Model and auth
- Full option matrix is in Auth and model options.
### Workspace
- Default is `~/.openclaw/workspace` (configurable).
- Seeds workspace files needed for the first-run bootstrap ritual.
- Workspace layout: Agent workspace.
### Gateway
- Prompts for port, bind, auth mode, and tailscale exposure.
- Recommended: keep token auth enabled even for loopback so local WS clients must authenticate.
- In token mode, interactive setup offers:
  - Generate/store plaintext token (default)
  - Use SecretRef (opt-in)
- In password mode, interactive setup also supports plaintext or SecretRef storage.
- Non-interactive token SecretRef path: `--gateway-token-ref-env <ENV_VAR>`.
  - Requires a non-empty env var in the onboarding process environment.
  - Cannot be combined with `--gateway-token`.
- Disable auth only if you fully trust every local process.
- Non-loopback binds still require auth.
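For example, a sketch of the non-interactive SecretRef path (the variable name is illustrative; any non-empty env var works):

```shell
# Keep the gateway token out of config by referencing an env var instead.
export OPENCLAW_GATEWAY_TOKEN="$(openssl rand -hex 32)"
openclaw onboard --gateway-token-ref-env OPENCLAW_GATEWAY_TOKEN
# Note: this cannot be combined with --gateway-token.
```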
### Channels
- WhatsApp: optional QR login
- Telegram: bot token
- Discord: bot token
- Google Chat: service account JSON + webhook audience
- Mattermost: bot token + base URL
- Signal: optional `signal-cli` install + account config
- BlueBubbles: recommended for iMessage; server URL + password + webhook
- iMessage: legacy `imsg` CLI path + DB access
- DM security: default is pairing. The first DM sends a code; approve via `openclaw pairing approve <channel> <code>` or use allowlists.
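Approving a paired DM then looks like this (the channel name and code are placeholders; use the values from the first DM):

```shell
# The first DM from a new peer prints a pairing code; approve it by hand.
openclaw pairing approve telegram 482913
```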
### Daemon install
- macOS: LaunchAgent
  - Requires a logged-in user session; for headless use, create a custom LaunchDaemon (not shipped).
- Linux and Windows via WSL2: systemd user unit
  - The wizard attempts `loginctl enable-linger <user>` so the gateway stays up after logout.
  - This may prompt for sudo (it writes `/var/lib/systemd/linger`); it tries without sudo first.
- Native Windows: Scheduled Task first
  - If task creation is denied, OpenClaw falls back to a per-user Startup-folder login item and starts the gateway immediately.
  - Scheduled Tasks remain preferred because they provide better supervisor status.
- Runtime selection: Node (recommended; required for WhatsApp and Telegram). Bun is not recommended.
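On Linux you can check the lingering step yourself with plain systemd tooling (assumes `loginctl` is available on the host):

```shell
# Verify the gateway will survive logout: Linger should be "yes".
loginctl show-user "$USER" --property=Linger

# Enable it manually if the wizard's attempt failed (may need sudo).
loginctl enable-linger "$USER"
```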
### Health check
- Starts the gateway (if needed) and runs `openclaw health`.
- `openclaw status --deep` adds the live gateway health probe to status output, including channel probes when supported.
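The same checks can be re-run at any time after onboarding:

```shell
# Post-onboarding health check.
openclaw health

# Deeper status: adds the live gateway health probe and channel probes.
openclaw status --deep
```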
### Skills
- Reads available skills and checks requirements.
- Lets you choose node manager: npm, pnpm, or bun.
- Installs optional dependencies (some use Homebrew on macOS).
If no GUI is detected, the wizard prints SSH port-forward instructions for the Control UI instead of opening a browser.
If Control UI assets are missing, the wizard attempts to build them; the fallback is `pnpm ui:build` (auto-installs UI deps).

## Remote mode details
Remote mode configures this machine to connect to a gateway elsewhere. Remote mode does not install or modify anything on the remote host.
- Remote gateway URL (`ws://...`)
- Token if remote gateway auth is required (recommended)
- If the gateway is loopback-only, use SSH tunneling or a tailnet.
- Discovery hints:
  - macOS: Bonjour (`dns-sd`)
  - Linux: Avahi (`avahi-browse`)
## Auth and model options
### Anthropic API key
Uses `ANTHROPIC_API_KEY` if present, or prompts for a key, then saves it for daemon use.

### Anthropic Claude CLI
Reuses a local Claude CLI login on the gateway host and switches model selection to a canonical `claude-cli/claude-*` ref. This is an available local fallback path in `openclaw onboard` and `openclaw configure`. For production, prefer an Anthropic API key.
- macOS: checks the Keychain item "Claude Code-credentials"
- Linux and Windows: reuses `~/.claude/.credentials.json` if present
### OpenAI Code subscription (Codex CLI reuse)
If `~/.codex/auth.json` exists, the wizard can reuse it.
Reused Codex CLI credentials stay managed by Codex CLI; on expiry, OpenClaw re-reads that source first and, when the provider can refresh it, writes the refreshed credential back to Codex storage instead of taking ownership itself.

### OpenAI Code subscription (OAuth)
Browser flow; paste `code#state`. Sets `agents.defaults.model` to `openai-codex/gpt-5.4` when the model is unset or `openai/*`.

### OpenAI API key
Uses `OPENAI_API_KEY` if present, or prompts for a key, then stores the credential in auth profiles. Sets `agents.defaults.model` to `openai/gpt-5.4` when the model is unset, `openai/*`, or `openai-codex/*`.

### xAI (Grok) API key
Prompts for `XAI_API_KEY` and configures xAI as a model provider.

### OpenCode
Prompts for `OPENCODE_API_KEY` (or `OPENCODE_ZEN_API_KEY`) and lets you choose the Zen or Go catalog. Setup URL: opencode.ai/auth.

### API key (generic)
Stores the key for you.
### Vercel AI Gateway
Prompts for `AI_GATEWAY_API_KEY`. More detail: Vercel AI Gateway.

### Cloudflare AI Gateway
Prompts for the account ID, gateway ID, and `CLOUDFLARE_AI_GATEWAY_API_KEY`. More detail: Cloudflare AI Gateway.

### MiniMax
Config is auto-written. The hosted default is `MiniMax-M2.7`; API-key setup uses `minimax/...`, and OAuth setup uses `minimax-portal/...`. More detail: MiniMax.

### StepFun
Config is auto-written for StepFun standard or Step Plan on China or global endpoints.
Standard currently includes `step-3.5-flash`, and Step Plan also includes `step-3.5-flash-2603`. More detail: StepFun.

### Synthetic (Anthropic-compatible)
Prompts for `SYNTHETIC_API_KEY`. More detail: Synthetic.

### Ollama (Cloud and local open models)
Prompts for the base URL (default `http://127.0.0.1:11434`), then offers Cloud + Local or Local mode. Discovers available models and suggests defaults. More detail: Ollama.

### Moonshot and Kimi Coding
Moonshot (Kimi K2) and Kimi Coding configs are auto-written.
More detail: Moonshot AI (Kimi + Kimi Coding).
### Custom provider
Works with OpenAI-compatible and Anthropic-compatible endpoints. Interactive onboarding supports the same API key storage choices as other provider API key flows:
- Paste API key now (plaintext)
- Use secret reference (env ref or configured provider ref, with preflight validation)
Non-interactive flags:
- `--auth-choice custom-api-key`
- `--custom-base-url`
- `--custom-model-id`
- `--custom-api-key` (optional; falls back to `CUSTOM_API_KEY`)
- `--custom-provider-id` (optional)
- `--custom-compatibility <openai|anthropic>` (optional; default `openai`)
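Putting those flags together, a sketch of a non-interactive custom provider setup (the base URL and model id are placeholders for your own deployment):

```shell
# Non-interactive setup against an OpenAI-compatible endpoint.
export CUSTOM_API_KEY="sk-..."   # picked up when --custom-api-key is omitted
openclaw onboard \
  --auth-choice custom-api-key \
  --custom-base-url https://llm.example.com/v1 \
  --custom-model-id my-model \
  --custom-compatibility openai
```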
### Skip
Leaves auth unconfigured.
### Model selection
- Pick the default model from detected options, or enter the provider and model manually.
- When onboarding starts from a provider auth choice, the model picker prefers that provider automatically. For Volcengine and BytePlus, the same preference also matches their coding-plan variants (`volcengine-plan/*`, `byteplus-plan/*`).
- If that preferred-provider filter would be empty, the picker falls back to the full catalog instead of showing no models.
- Wizard runs a model check and warns if the configured model is unknown or missing auth.
- Auth profiles (API keys + OAuth): `~/.openclaw/agents/<agentId>/agent/auth-profiles.json`
- Legacy OAuth import: `~/.openclaw/credentials/oauth.json`
- Default onboarding behavior persists API keys as plaintext values in auth profiles.
- `--secret-input-mode ref` enables reference mode instead of plaintext key storage. In interactive setup, you can choose either:
  - an environment variable ref (for example `keyRef: { source: "env", provider: "default", id: "OPENAI_API_KEY" }`)
  - a configured provider ref (`file` or `exec`) with provider alias + id
- Interactive reference mode runs a fast preflight validation before saving.
  - Env refs: validates the variable name and a non-empty value in the current onboarding environment.
  - Provider refs: validates the provider config and resolves the requested id.
  - If preflight fails, onboarding shows the error and lets you retry.
- In non-interactive mode, `--secret-input-mode ref` is env-backed only.
  - Set the provider env var in the onboarding process environment.
  - Inline key flags (for example `--openai-api-key`) require that env var to be set; otherwise onboarding fails fast.
  - For custom providers, non-interactive `ref` mode stores `models.providers.<id>.apiKey` as `{ source: "env", provider: "default", id: "CUSTOM_API_KEY" }`.
  - In that custom-provider case, `--custom-api-key` requires `CUSTOM_API_KEY` to be set; otherwise onboarding fails fast.
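A custom provider entry stored in ref mode then looks roughly like this (a sketch: the `custom` provider id and surrounding nesting are illustrative; the `apiKey` shape matches the documented ref):

```json
{
  "models": {
    "providers": {
      "custom": {
        "apiKey": { "source": "env", "provider": "default", "id": "CUSTOM_API_KEY" }
      }
    }
  }
}
```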
- Gateway auth credentials support plaintext and SecretRef choices in interactive setup:
  - Token mode: generate/store a plaintext token (default) or use a SecretRef.
  - Password mode: plaintext or SecretRef.
  - Non-interactive token SecretRef path: `--gateway-token-ref-env <ENV_VAR>`.
- Existing plaintext setups continue to work unchanged.
Headless and server tip: complete OAuth on a machine with a browser, then copy that agent's `auth-profiles.json` (for example `~/.openclaw/agents/<agentId>/agent/auth-profiles.json`, or the matching `$OPENCLAW_STATE_DIR/...` path) to the gateway host. `credentials/oauth.json` is only a legacy import source.

## Outputs and internals
Typical fields in `~/.openclaw/openclaw.json`:
- `agents.defaults.workspace`
- `agents.defaults.model` / `models.providers` (if MiniMax chosen)
- `tools.profile` (local onboarding defaults to `"coding"` when unset; existing explicit values are preserved)
- `gateway.*` (mode, bind, auth, tailscale)
- `session.dmScope` (local onboarding defaults this to `per-channel-peer` when unset; existing explicit values are preserved)
- `channels.telegram.botToken`, `channels.discord.token`, `channels.matrix.*`, `channels.signal.*`, `channels.imessage.*`
- Channel allowlists (Slack, Discord, Matrix, Microsoft Teams) when you opt in during prompts (names resolve to IDs when possible)
- `skills.install.nodeManager`
  - The setup `--node-manager` flag accepts `npm`, `pnpm`, or `bun`.
  - Manual config can still set `skills.install.nodeManager: "yarn"` later.
- `wizard.lastRunAt`
- `wizard.lastRunVersion`
- `wizard.lastRunCommit`
- `wizard.lastRunCommand`
- `wizard.lastRunMode`
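Put together, a minimal resulting config might look like this (a sketch: values are illustrative, only the key paths listed above are documented):

```json
{
  "agents": {
    "defaults": {
      "workspace": "~/.openclaw/workspace",
      "model": "openai-codex/gpt-5.4"
    }
  },
  "tools": { "profile": "coding" },
  "session": { "dmScope": "per-channel-peer" },
  "wizard": { "lastRunMode": "local" }
}
```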
`openclaw agents add` writes `agents.list[]` and optional bindings.
WhatsApp credentials go under `~/.openclaw/credentials/whatsapp/<accountId>/`.
Sessions are stored under `~/.openclaw/agents/<agentId>/sessions/`.
Some channels are delivered as plugins. When selected during setup, the wizard
prompts to install the plugin (npm or local path) before channel configuration.
The wizard flow can also be driven programmatically via `wizard.start`, `wizard.next`, `wizard.cancel`, and `wizard.status`.
Signal setup can install `signal-cli` automatically:
- Downloads the appropriate release asset
- Stores it under `~/.openclaw/tools/signal-cli/<version>/`
- Writes `channels.signal.cliPath` in config
- JVM builds require Java 21
- Native builds are used when available
- Windows uses WSL2 and follows the Linux signal-cli flow inside WSL
## Related docs
- Onboarding hub: Onboarding (CLI)
- Automation and scripts: CLI Automation
- Command reference: `openclaw onboard`