Documentation Index

Fetch the complete documentation index at: https://docs.llmrouter.app/llms.txt Use this file to discover all available pages before exploring further.

Blackbox AI & LLM Router

Blackbox AI is a fast and powerful terminal-based AI coding assistant. It excels at generating code, explaining complex logic, and scaffolding projects directly from your command line. Because terminal assistants frequently ingest large amounts of local code and command output, they consume massive numbers of input tokens. By connecting Blackbox AI to LLM Router, you automatically apply Middle-Out Context Compression to bloated terminal histories and ensure your local .env secrets are redacted before they reach third-party AI providers like OpenAI or Anthropic.

Step 1: Installing Blackbox AI CLI

If you haven’t installed Blackbox AI yet, run the appropriate command for your operating system:
curl -fsSL https://blackbox.ai/install.sh | bash
Note: You may need to restart your terminal after installation.
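After the installer finishes, you can confirm the binary landed on your PATH with a quick shell check (a minimal sketch; `command -v` is a POSIX built-in, so it works in both bash and zsh):

```shell
# Sketch: verify the installer put `blackbox` on your PATH.
# If it is not found, restart your terminal so your shell picks up
# the updated PATH, or re-run the install script.
if command -v blackbox >/dev/null 2>&1; then
  echo "blackbox installed at: $(command -v blackbox)"
else
  echo "blackbox not found on PATH; restart your terminal" >&2
fi
```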

Step 2: Configuring LLM Router

Blackbox AI natively supports custom OpenAI-compatible providers, making integration with LLM Router straightforward.
1. Get Your API Key

Log into the LLM Router Dashboard and generate a new API key (e.g., sk-router-...).
2. Run the Configuration Tool

Open your terminal and run the built-in configuration wizard:
blackbox configure
3. Set LLM Router as the Provider

When prompted by the wizard, enter the following settings:
  • Provider: Select OpenAI Compatible (Do not select OpenAI directly, as we need to override the Base URL).
  • Base URL: Enter https://api.llmrouter.app/v1
  • API Key: Paste your LLM Router API key (sk-router-...)
  • Model: Enter the specific model you want to use, prefixed with the provider slug (e.g., anthropic/claude-3-5-sonnet or openai/gpt-4o).
Model Selection: Because LLM Router acts as a universal gateway, you must prefix your chosen model with the provider slug. If you have configured Tag-Based routing in your LLM Router dashboard, you can simply set the model to "router-auto".
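Before launching Blackbox AI, you can sanity-check your key and model slug directly against the gateway. This is a minimal sketch assuming LLM Router exposes the standard OpenAI /chat/completions route (which is what "OpenAI Compatible" clients expect); the key and model below are placeholders you should replace with your own:

```shell
# Sketch: probe LLM Router's OpenAI-compatible chat endpoint.
# Assumes the standard /chat/completions route; key and model
# shown are placeholders, not real values.
curl https://api.llmrouter.app/v1/chat/completions \
  -H "Authorization: Bearer sk-router-your-api-key-here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-3-5-sonnet",
    "messages": [{"role": "user", "content": "Reply with OK."}]
  }'
```

A JSON response with a `choices` array indicates the key and model slug are valid; an authentication error usually means the key was pasted incorrectly.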

Alternative: Environment Variables

If you prefer not to use the interactive configuration wizard (or if you are automating the setup), you can configure Blackbox AI using standard environment variables in your ~/.zshrc or ~/.bashrc:
export OPENAI_BASE_URL="https://api.llmrouter.app/v1"
export OPENAI_API_KEY="sk-router-your-api-key-here"
export OPENAI_MODEL="anthropic/claude-3-5-sonnet"
Don’t forget to run source ~/.zshrc (or source ~/.bashrc) after making changes so the variables take effect in your current session.
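Once the variables are exported, a quick sanity check can catch typos, such as a missing /v1 suffix, before Blackbox AI makes its first request. This is a minimal POSIX-shell sketch; the placeholder values stand in for what you would normally put in ~/.zshrc:

```shell
# Sketch: verify the LLM Router env vars are set and well-formed.
# Placeholder values shown; in practice these come from ~/.zshrc.
export OPENAI_BASE_URL="https://api.llmrouter.app/v1"
export OPENAI_API_KEY="sk-router-your-api-key-here"
export OPENAI_MODEL="anthropic/claude-3-5-sonnet"

status="ok"
# Each variable must be non-empty.
[ -n "$OPENAI_BASE_URL" ] || status="missing OPENAI_BASE_URL"
[ -n "$OPENAI_API_KEY" ]  || status="missing OPENAI_API_KEY"
[ -n "$OPENAI_MODEL" ]    || status="missing OPENAI_MODEL"
# OpenAI-compatible clients expect the base URL to end in /v1.
case "$OPENAI_BASE_URL" in
  */v1) ;;
  *) status="OPENAI_BASE_URL should end in /v1" ;;
esac
echo "$status"  # prints "ok" when everything is set correctly
```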

Step 3: Start Using Blackbox AI

Once configured, you can launch Blackbox AI in interactive mode:
blackbox
Or run it with a specific, one-off task:
blackbox "Implement a user authentication system with JWT in Next.js"