# Amazon Bedrock Mantle
OpenClaw includes a bundled Amazon Bedrock Mantle provider that connects to the Mantle OpenAI-compatible endpoint. Mantle hosts open-source and third-party models (GPT-OSS, Qwen, Kimi, GLM, and similar) through a standard `/v1/chat/completions` surface backed by Bedrock infrastructure.
## What OpenClaw supports
- Provider: `amazon-bedrock-mantle`
- API: `openai-completions` (OpenAI-compatible)
- Auth: bearer token via `AWS_BEARER_TOKEN_BEDROCK`
- Region: `AWS_REGION` or `AWS_DEFAULT_REGION` (default: `us-east-1`)
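As a quick illustration, the two environment variables above can be set like this (the token value is a placeholder; substitute the bearer token issued for your account):

```shell
# Placeholder value; use your real Bedrock bearer token.
export AWS_BEARER_TOKEN_BEDROCK="example-token"
# Optional: pick a supported region (us-east-1 is used when unset).
export AWS_REGION="us-east-1"
echo "region: ${AWS_REGION:-us-east-1}"
```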
## Automatic model discovery
When `AWS_BEARER_TOKEN_BEDROCK` is set, OpenClaw automatically discovers available Mantle models by querying the region's `/v1/models` endpoint. Discovery results are cached for 1 hour.
Supported regions: `us-east-1`, `us-east-2`, `us-west-2`, `ap-northeast-1`, `ap-south-1`, `ap-southeast-3`, `eu-central-1`, `eu-west-1`, `eu-west-2`, `eu-south-1`, `eu-north-1`, `sa-east-1`.
## Onboarding
- Set the bearer token (`AWS_BEARER_TOKEN_BEDROCK`) on the gateway host.
- Verify that models are discovered.
- Select the `amazon-bedrock-mantle` provider. No additional config is required unless you want to override defaults.
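The first two steps can be sketched as follows. `MANTLE_BASE_URL` is a stand-in for your region's Mantle endpoint (the exact URL depends on your deployment and is not specified here):

```shell
# Step 1: set the bearer token on the gateway host (placeholder value).
export AWS_BEARER_TOKEN_BEDROCK="example-token"

# Step 2: verify discovery by listing models. MANTLE_BASE_URL is a
# stand-in; substitute the real base URL for your region.
curl -s \
  -H "Authorization: Bearer $AWS_BEARER_TOKEN_BEDROCK" \
  "$MANTLE_BASE_URL/v1/models"
```

If the response contains a non-empty model list, discovery is working and OpenClaw will pick the models up on its next refresh.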
## Manual configuration
If you prefer explicit config instead of auto-discovery, you can declare the provider in your OpenClaw config.
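The sketch below shows the general shape of such an entry; the key names are illustrative, not OpenClaw's authoritative schema — consult your config reference for the exact fields:

```json
{
  "provider": "amazon-bedrock-mantle",
  "api": "openai-completions",
  "region": "us-east-1"
}
```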
## Notes

- Mantle requires a bearer token today. Plain IAM credentials (instance roles, SSO, access keys) are not sufficient without a token.
- The bearer token is the same `AWS_BEARER_TOKEN_BEDROCK` used by the standard Amazon Bedrock provider.
- Reasoning support is inferred from model IDs containing patterns like `thinking`, `reasoner`, or `gpt-oss-120b`.
- If the Mantle endpoint is unavailable or returns no models, the provider is silently skipped.
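To illustrate the ID-pattern inference described above — a substring-match sketch with made-up model IDs, not OpenClaw's actual detection code:

```shell
# Illustrative model IDs; the match logic mirrors the patterns named
# in the note above (thinking, reasoner, gpt-oss-120b).
for id in qwen-thinking deepseek-reasoner gpt-oss-120b gpt-oss-20b; do
  case "$id" in
    *thinking*|*reasoner*|*gpt-oss-120b*) echo "$id: reasoning" ;;
    *) echo "$id: standard" ;;
  esac
done
```

Only the last ID matches none of the patterns, so it alone is reported as standard.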