Documentation Index

Fetch the complete documentation index at:
https://docs.llmrouter.app/llms.txt
Use this file to discover all available models, tags, and features before exploring further.

LibreChat Integration

LibreChat is a popular open-source AI chat platform that you can self-host. It supports multiple providers and gives you a beautiful ChatGPT-like interface. By connecting LibreChat to LLM Router, you get intelligent tag-based routing, automatic Skill injection, context optimization, Zero Data Retention (ZDR), and access to the best models from Anthropic, OpenAI, Google, xAI, and more through a single endpoint.

Configuring LibreChat with LLM Router

1. Install LibreChat

git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat
cp .env.example .env
Note: Windows users can use copy instead of cp. Docker Desktop is required.

2. Create Docker Override File

Create a file named docker-compose.override.yml in the root directory:
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml

3. Add API Key to Environment

Add your LLM Router API key to the .env file:
LLM_ROUTER_API_KEY=sk_llmr_your_key_here
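At startup, LibreChat expands the `${LLM_ROUTER_API_KEY}` placeholder in `librechat.yaml` from this environment variable. A minimal sketch of the same substitution, useful for confirming the variable is set before starting the stack (the placeholder key value is illustrative):

```python
import os

# Simulate the value that docker compose would load from the .env file.
os.environ["LLM_ROUTER_API_KEY"] = "sk_llmr_your_key_here"

key = os.environ.get("LLM_ROUTER_API_KEY")
if not key:
    raise SystemExit("LLM_ROUTER_API_KEY is not set; check your .env file")

# LibreChat sends the key as a standard OpenAI-style bearer token.
headers = {"Authorization": f"Bearer {key}"}
print(headers["Authorization"])
```

If the variable is missing or empty, LibreChat's requests to the endpoint will fail with an authentication error, so checking it up front saves a debugging round trip.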

4. Configure Custom Endpoint

Create a librechat.yaml file in the root directory:
version: 1.2.8
cache: true

endpoints:
  custom:
    - name: "LLM Router"
      apiKey: "${LLM_ROUTER_API_KEY}"
      baseURL: "https://api.llmrouter.app/v1"
      titleConvo: true
      models:
        default:
          - "anthropic/claude-opus-4.6"
          - "anthropic/claude-sonnet-4.6"
          - "openai/gpt-5.4"
          - "google/gemini-3.1-pro"
          - "xai/grok-4.20"
        fetch: true
      titleModel: "anthropic/claude-sonnet-4.6"
Tip: Setting fetch: true automatically pulls all available models from LLM Router.
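Before pointing LibreChat at the endpoint, it can help to sanity-check the `baseURL` outside of LibreChat. The sketch below builds, but does not send, an OpenAI-compatible chat completion request; the `/chat/completions` path follows the OpenAI API convention implied by the `/v1` baseURL, and the API key is a placeholder:

```python
import json
import urllib.request

BASE_URL = "https://api.llmrouter.app/v1"
API_KEY = "sk_llmr_your_key_here"  # placeholder; use your real key

# Same model identifier format as in librechat.yaml: provider/model.
payload = {
    "model": "anthropic/claude-sonnet-4.6",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would send it; omitted here to keep the sketch offline.
print(req.full_url)
```

If a request like this succeeds with your real key, any remaining problems are in the LibreChat configuration rather than the endpoint or credentials.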

5. Start LibreChat

docker compose up -d
If LibreChat is already running, run docker compose up -d again; it recreates any containers whose configuration changed. Note that docker compose restart restarts the existing containers without re-reading the compose files, so it will not pick up a newly added docker-compose.override.yml.
Access LibreChat at http://localhost:3080

6. Select LLM Router Endpoint

In the LibreChat interface:
  1. Click the endpoint dropdown at the top
  2. Select LLM Router
  3. Choose your preferred model

Configuration Options

You can further customize the endpoint in librechat.yaml:
  • titleConvo: Enable automatic conversation titles
  • titleModel: Model used for generating titles
  • dropParams: Remove unsupported parameters
  • modelDisplayLabel: Custom label for the endpoint
See the LibreChat custom endpoints documentation for more options.
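As an illustration, these options slot into the same endpoint entry from step 4 (the dropParams values below are examples only; which parameters need dropping depends on the models you use):

endpoints:
  custom:
    - name: "LLM Router"
      apiKey: "${LLM_ROUTER_API_KEY}"
      baseURL: "https://api.llmrouter.app/v1"
      modelDisplayLabel: "LLM Router"
      dropParams:
        - "frequency_penalty"
        - "presence_penalty"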