
Documentation Index

Fetch the complete documentation index at https://docs.llmrouter.app/llms.txt. Use this file to discover all available pages before exploring further.
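For example, from the command line:

```shell
# Download the machine-readable documentation index listed above.
curl -s https://docs.llmrouter.app/llms.txt
```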

OpenClaw & LLM Router

OpenClaw is a powerful open-source AI coding agent designed for the terminal. It excels at autonomous code generation, project scaffolding, debugging, and multi-step development tasks. Because autonomous agents like OpenClaw rapidly execute commands, read large files, and ingest massive terminal outputs, they are highly prone to token bloat and accidental secret leaks. By connecting OpenClaw to LLM Router, you automatically apply Middle-Out Context Compression to bloated terminal histories and ensure your local .env secrets are redacted before they reach third-party AI providers.

Step 1: Installing OpenClaw

Install OpenClaw globally via npm:
npm install -g openclaw

Step 2: Configuring LLM Router

OpenClaw natively supports custom OpenAI-compatible providers, making it incredibly simple to connect to LLM Router.
1. Get Your API Key

Log into the LLM Router Dashboard and generate a new API key (e.g., sk-router-...).
2. Edit the OpenClaw Config

OpenClaw uses a local JSON file for configuration. Open your terminal and run the configuration command:
openclaw config
Alternatively, you can manually edit the file located at ~/.openclaw/config.json.
3. Add LLM Router Settings

Update the configuration file to point the baseUrl to LLM Router. Make sure the provider is set to "openai".
~/.openclaw/config.json
{
  "provider": "openai",
  "baseUrl": "https://api.llmrouter.app/v1",
  "apiKey": "sk-router-your-api-key-here",
  "model": "anthropic/claude-3-5-sonnet",
  "temperature": 0.7
}
Model Selection: Because LLM Router acts as a universal gateway, you must prefix your chosen model with the provider slug (e.g., anthropic/claude-3-5-sonnet, deepseek/deepseek-chat, or openai/gpt-4o).
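Strict JSON parsers reject stray comments and trailing commas, so it is worth validating the file before launching OpenClaw. A quick check, assuming python3 is available on your PATH:

```shell
# Validate ~/.openclaw/config.json before launching OpenClaw.
# python3 -m json.tool exits non-zero on a JSON syntax error.
if python3 -m json.tool ~/.openclaw/config.json > /dev/null; then
  echo "config OK"
else
  echo "config has a JSON syntax error" >&2
fi
```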

Step 3: Start Using OpenClaw

With the configuration saved, you can now launch OpenClaw. All traffic is securely routed through LLM Router. Run OpenClaw in interactive mode:
openclaw
Or pass a specific task directly:
openclaw "Implement user authentication with JWT and refresh tokens"
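Before kicking off a long autonomous task, you can confirm end-to-end connectivity with a direct request to the router. This is a sketch, not an official recipe: it assumes the standard OpenAI-compatible /chat/completions route under the baseUrl configured above, and you should substitute your own key.

```shell
# Hypothetical smoke test against the LLM Router gateway.
# Assumes the OpenAI-compatible /chat/completions route under the baseUrl.
curl -s https://api.llmrouter.app/v1/chat/completions \
  -H "Authorization: Bearer sk-router-your-api-key-here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-3-5-sonnet",
    "messages": [{"role": "user", "content": "Reply with the single word: pong"}]
  }'
```

A JSON response containing a choices array indicates the key and provider-prefixed model slug are valid; an error body here usually means the same request will fail inside OpenClaw.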

The LLM Router Advantage

While OpenClaw builds your features, LLM Router is working in the background:
  • Token Savings: When OpenClaw reads a 2,000-line file, LLM Router dynamically compresses the irrelevant middle sections if the context shifts, saving you up to 80% on API costs.
  • Security (ZDR & Redaction): If OpenClaw accidentally reads an AWS key from your local workspace, LLM Router’s PII engine redacts it to [TOKEN_REDACTED] before the code reaches Anthropic or OpenAI.
  • Reliability: If the primary AI provider experiences an outage, LLM Router will automatically fall back to a secondary provider without breaking OpenClaw’s autonomous loop.