Documentation Index
Fetch the complete documentation index at: https://docs.llmrouter.app/llms.txt
Use this file to discover all available pages before exploring further.
Crush & LLM Router
Crush, like similar terminal-based AI agents, is a powerful CLI/TUI tool that handles code generation, debugging, and file operations directly inside your command line.
Because terminal agents frequently read massive error logs, command outputs, and local files, they consume huge amounts of input tokens. By routing Crush through LLM Router, you automatically apply Middle-Out Context Compression to bloated terminal outputs and ensure your local .env secrets are redacted before they are sent to third-party AI providers.
Important: When using LLM Router with custom CLI tools, you must configure the base URL to point to https://api.llmrouter.app/v1 to ensure OpenAI SDK compatibility.
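As a quick sanity check before wiring up any tool, you can hit the base URL directly. This is a hedged sketch: it assumes LLM Router exposes the standard OpenAI-compatible /models route, and the key shown is a placeholder, not a real credential.

```shell
# Sketch: verify the OpenAI-compatible base URL responds (route assumed: /models).
BASE_URL="https://api.llmrouter.app/v1"
API_KEY="sk-router-your-api-key"   # placeholder - substitute your real key

# "|| echo" keeps the check non-fatal if you are offline or the key is invalid.
curl -s -H "Authorization: Bearer $API_KEY" "$BASE_URL/models" \
  || echo "request failed - check your network and API key"
```

If the endpoint is reachable and the key is valid, the response is the usual OpenAI-style JSON list of model IDs.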
Step 1: Installing Crush
If you haven’t installed Crush yet, select the appropriate installation method based on your system:
Homebrew (macOS/Linux):
brew install charmbracelet/tap/crush
NPM (Cross-Platform):
npm install -g @charmland/crush
Arch Linux:
yay -S crush-bin
Nix:
nix run github:numtide/nix-ai-tools#crush
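Whichever method you used, it is worth confirming the binary landed on your PATH before continuing. A generic check (no Crush-specific flags assumed):

```shell
# Confirm the crush binary is reachable; print a hint if it is not.
if command -v crush >/dev/null 2>&1; then
  CRUSH_STATUS="installed"
  echo "crush found at $(command -v crush)"
else
  CRUSH_STATUS="missing"
  echo "crush not found - check that your package manager's bin dir is on PATH"
fi
```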
Step 2: Modifying the Crush Configuration
To point Crush to LLM Router, you need to edit its local configuration file and add LLM Router as a custom provider.
1. Locate the Configuration File
Depending on your operating system, the crush.json configuration file can be found here:
~/.config/crush/crush.json
2. Add LLM Router as a Provider
Open the crush.json file and add the LLM Router endpoint under your providers list. Make sure to replace sk-router-your-api-key with your actual LLM Router API key.
{
  "providers": {
    "llmrouter": {
      "id": "llmrouter",
      "name": "LLM Router",
      "base_url": "https://api.llmrouter.app/v1",
      "api_key": "sk-router-your-api-key"
    }
  }
}
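A malformed crush.json (a trailing comma, a missing brace) is a common reason Crush fails to pick up a new provider, so it can save a debugging round-trip to confirm the file still parses before launching. A minimal sketch using Python's stdlib json.tool; it writes the sample config above (placeholder key included) to a temp file, but in practice you would point json.tool at ~/.config/crush/crush.json directly:

```shell
# Write the sample provider block to a temp file and validate it parses as JSON.
CONFIG="$(mktemp)"
cat > "$CONFIG" <<'EOF'
{
  "providers": {
    "llmrouter": {
      "id": "llmrouter",
      "name": "LLM Router",
      "base_url": "https://api.llmrouter.app/v1",
      "api_key": "sk-router-your-api-key"
    }
  }
}
EOF
python3 -m json.tool "$CONFIG" >/dev/null && echo "crush.json sample is valid JSON"
```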
API Key Location: You can generate your LLM Router API Key in the Dashboard. Your key is stored locally by Crush and is never exposed.
Step 3: Launch Crush and Select Your Model
- Run the crush command in your terminal to start the application.
- Open the command palette (usually ctrl+p) and choose “Switch Model”.
- Because LLM Router acts as a universal gateway, you can type the name of any supported model, prefixed with the provider. For example:
  - anthropic/claude-3-5-sonnet (Best for heavy coding/refactoring)
  - openai/gpt-4o
  - deepseek/deepseek-chat (Highly cost-efficient)
  - router-auto (If you want LLM Router to dynamically pick the model based on your Tags)
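These IDs are plain provider/model strings. If you ever need to script around them (illustrative only, not part of Crush or LLM Router), the split is simply on the first slash:

```shell
# Split a provider-prefixed model ID into its two parts.
MODEL="anthropic/claude-3-5-sonnet"
echo "provider: ${MODEL%%/*}"   # prints "provider: anthropic"
echo "model: ${MODEL#*/}"       # prints "model: claude-3-5-sonnet"
```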
Step 4: The LLM Router Advantage in the Terminal
Once configured, you can use Crush as normal to generate code, execute terminal commands, and debug.
Because traffic is now flowing through LLM Router, you instantly gain:
- Token Savings on Logs: If Crush runs npm run build and ingests 3,000 lines of terminal output, LLM Router will dynamically compress the repetitive middle sections, saving you up to 80% on API costs.
- Zero-Trust Security: If Crush accidentally reads your .npmrc or AWS credentials from your local workspace, LLM Router’s PII engine will redact them into [TOKEN_REDACTED] before sending the context to Claude or GPT-4.
- Resilience: If your primary AI provider experiences an outage, LLM Router will automatically fall back to your configured secondary provider without interrupting your CLI session.