Documentation Index

Fetch the complete documentation index at https://docs.llmrouter.app/llms.txt and use it to discover all available pages before exploring further.

Factory Droid & LLM Router

Factory Droid is an enterprise-grade AI coding agent that lives in your terminal and handles end-to-end development workflows. Because terminal agents frequently read massive error logs and local .env files, routing Factory Droid through LLM Router is highly recommended. It automatically prunes bloated terminal history to save costs and redacts your local API keys before they are sent to third-party AI providers.
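The redaction step can be pictured with a short sketch. This is an illustrative stand-in, not LLM Router's actual implementation; the regex patterns are assumptions about what a secret scanner might match.

```python
import re

# Illustrative patterns only -- a real redactor covers many more secret formats.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9-]{16,}"),       # OpenAI-style API keys
    re.compile(r"AKIA[0-9A-Z]{16}"),           # AWS access key IDs
    re.compile(r"(?m)^(\w*PASSWORD\w*)=.*$"),  # .env password lines
]

def redact(text: str) -> str:
    """Mask anything that looks like a secret before the prompt leaves the machine."""
    for pattern in SECRET_PATTERNS:
        if pattern.groups:  # keep the variable name, mask only the value
            text = pattern.sub(lambda m: f"{m.group(1)}=[REDACTED]", text)
        else:
            text = pattern.sub("[REDACTED]", text)
    return text

env_file = "DB_PASSWORD=hunter2\nAWS_KEY=AKIAIOSFODNN7EXAMPLE\n"
print(redact(env_file))  # both values come back as [REDACTED]
```

The point is that masking happens locally, so the upstream provider only ever sees the placeholder.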

Step 1: Installing Factory Droid

If you haven’t already installed Factory Droid, run the command for your OS.
macOS / Linux:
curl -fsSL https://app.factory.ai/cli | sh
Windows:
irm https://app.factory.ai/cli/windows | iex

Step 2: Configuring LLM Router

Factory Droid supports BYOK (Bring Your Own Key), which lets you override the default API endpoints and point all traffic directly at LLM Router.

1. Get Your LLM Router API Key

  1. Visit the LLM Router Dashboard
  2. Generate a new API Key (e.g., sk-router-...)

2. Configure Custom Models in Droid

You need to edit Factory Droid’s local settings file. Configuration file location:
  • macOS/Linux: ~/.factory/settings.json
  • Windows: %USERPROFILE%\.factory\settings.json
You can configure LLM Router using either the OpenAI format or the Anthropic format. Remember to replace your_llm_router_key with your actual API Key.
Method A: OpenRouter / OpenAI Protocol (Recommended)
Use this to let LLM Router handle routing automatically, or to request OpenAI/Llama models specifically. You can also set "model" to a tag-based routed model; note that settings.json is plain JSON, so keep comments out of the file.
{
  "customModels": [
    {
      "displayName": "LLM Router (Auto Fallback)",
      "model": "openai/gpt-4o",
      "baseUrl": "https://api.llmrouter.app/v1",
      "apiKey": "your_llm_router_key",
      "provider": "generic-chat-completion-api",
      "maxOutputTokens": 8192
    }
  ]
}
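If you prefer to script the edit, the sketch below merges the Method A entry into settings.json without clobbering any settings already in the file. The path and field names come from this guide; treat it as a convenience helper, not an official tool.

```python
import json
from pathlib import Path

def add_custom_model(settings_path: Path, api_key: str) -> None:
    """Merge an LLM Router entry into Factory Droid's settings.json."""
    settings = {}
    if settings_path.exists():
        settings = json.loads(settings_path.read_text() or "{}")
    entry = {
        "displayName": "LLM Router (Auto Fallback)",
        "model": "openai/gpt-4o",
        "baseUrl": "https://api.llmrouter.app/v1",
        "apiKey": api_key,
        "provider": "generic-chat-completion-api",
        "maxOutputTokens": 8192,
    }
    models = settings.get("customModels", [])
    # Replace an existing entry with the same displayName instead of duplicating it.
    settings["customModels"] = [
        m for m in models if m.get("displayName") != entry["displayName"]
    ] + [entry]
    settings_path.parent.mkdir(parents=True, exist_ok=True)
    settings_path.write_text(json.dumps(settings, indent=2))

add_custom_model(Path.home() / ".factory" / "settings.json", "your_llm_router_key")
```

Re-running the script simply updates the existing entry, so it is safe to use when rotating your API key.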
Method B: Anthropic Protocol
Use this if you specifically want to route to Claude models using Anthropic’s native formatting.
{
  "customModels": [
    {
      "displayName": "LLM Router (Claude 3.5 Sonnet)",
      "model": "anthropic/claude-3-5-sonnet",
      "baseUrl": "https://api.llmrouter.app/v1",
      "apiKey": "your_llm_router_key",
      "provider": "anthropic",
      "maxOutputTokens": 8192
    }
  ]
}
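After saving either variant, it is worth confirming that the file is valid JSON and that each entry carries the fields shown above. A minimal sanity check, assuming the field names used in this guide:

```python
import json
from pathlib import Path

REQUIRED_FIELDS = {"displayName", "model", "baseUrl", "apiKey", "provider"}

def check_settings(text: str) -> list[str]:
    """Return a list of problems found in a settings.json payload."""
    try:
        settings = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    for i, model in enumerate(settings.get("customModels", [])):
        missing = REQUIRED_FIELDS - model.keys()
        if missing:
            problems.append(f"customModels[{i}] missing: {sorted(missing)}")
        if model.get("apiKey") == "your_llm_router_key":
            problems.append(f"customModels[{i}] still uses the placeholder API key")
    return problems

path = Path.home() / ".factory" / "settings.json"
if path.exists():
    print(check_settings(path.read_text()))  # an empty list means the file looks good
```

A trailing comma or an unreplaced placeholder key is the most common reason a custom model fails to appear in Droid.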

Step 3: Start Using Factory Droid

1. Launch Droid

Navigate to your project directory and start the droid CLI:
cd /path/to/your/project
droid

2. Select Your LLM Router Model

Once Droid is running, type the /model command to change your active AI model:
/model
Your custom LLM Router configurations will appear in a separate “Custom models” section in the terminal UI. Select the one you just configured.

3. Start Coding

You can now use Droid to analyze code, implement features, and fix bugs, with all traffic flowing securely through LLM Router.

Why route Factory Droid through LLM Router?

Terminal-based agents are powerful, but they are incredibly token-hungry and pose unique security risks.
  • PII Redaction: Droid often reads your local directory to understand context. If it accidentally reads your .env file, LLM Router will automatically redact your database passwords and AWS keys before sending the prompt to OpenAI or Anthropic.
  • Context Pruning: If Droid runs a build command (npm run build) that spits out 10,000 lines of terminal logs, LLM Router’s Middle-Out Compression truncates the bloated middle sections, dramatically cutting your input-token costs.
  • Resilience: If OpenAI goes down while Droid is refactoring a file, LLM Router will automatically fall back to another provider without breaking your terminal session.
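The middle-out idea is simple to sketch: keep the head and tail of a long log, where commands and final errors usually live, and elide the middle. This illustrates the general technique, not LLM Router's exact algorithm, and the chunk sizes are arbitrary.

```python
def middle_out(lines: list[str], keep_head: int = 50, keep_tail: int = 50) -> list[str]:
    """Keep the first and last chunks of a long log and elide the middle."""
    if len(lines) <= keep_head + keep_tail:
        return lines  # short logs pass through untouched
    dropped = len(lines) - keep_head - keep_tail
    marker = f"... [{dropped} lines truncated by middle-out compression] ..."
    return lines[:keep_head] + [marker] + lines[-keep_tail:]

build_log = [f"webpack: compiled module {i}" for i in range(10_000)]
compressed = middle_out(build_log)
print(len(compressed))  # 101 lines instead of 10,000
```

Because the tail is preserved, the model still sees the final error message that usually matters most for debugging.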