Documentation Index
Fetch the complete documentation index at: https://docs.llmrouter.app/llms.txt. Use this file to discover all available models, tags, and features before exploring further.
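One way to fetch the index is from a shell (assuming curl is available):

```shell
# Download the documentation index to a local file for browsing or grepping
curl -o llms.txt https://docs.llmrouter.app/llms.txt
```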
LibreChat Integration
LibreChat is a popular open-source AI chat platform that you can self-host. It supports multiple providers and gives you a beautiful ChatGPT-like interface. By connecting LibreChat to LLM Router, you get intelligent tag-based routing, automatic Skill injection, context optimization, Zero Data Retention (ZDR), and access to the best models from Anthropic, OpenAI, Google, xAI, and more through a single endpoint.
Configuring LibreChat with LLM Router
1. Install LibreChat
Note: Windows users can use copy instead of cp. Docker Desktop is required.
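A typical install looks like the following. The repository URL is LibreChat's official GitHub location; adjust if you install from a fork or release archive:

```shell
# Clone LibreChat and create a local .env from the bundled example file
git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat
cp .env.example .env   # Windows: copy .env.example .env
```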
2. Create Docker Override File
Create a file named docker-compose.override.yml in the root directory:
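A minimal override, following LibreChat's documented pattern of bind-mounting a custom librechat.yaml into the api container:

```yaml
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```

Docker Compose merges this file with the base docker-compose.yml automatically, so no extra flags are needed when starting the stack.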
3. Add API Key to Environment
Add your LLM Router API key to the .env file:
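For example (the variable name LLM_ROUTER_API_KEY is an assumption here; use whatever name your librechat.yaml references):

```shell
# .env — referenced from librechat.yaml via ${LLM_ROUTER_API_KEY}
LLM_ROUTER_API_KEY=your-llm-router-api-key
```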
4. Configure Custom Endpoint
Create a librechat.yaml file in the root directory:
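A sketch of the endpoint entry, following LibreChat's custom-endpoint schema. The baseURL shown is an assumption; substitute the endpoint URL given in the LLM Router docs:

```yaml
version: 1.0.5
endpoints:
  custom:
    - name: "LLM Router"
      # Reads the key you added to .env
      apiKey: "${LLM_ROUTER_API_KEY}"
      # Assumed endpoint URL — check the LLM Router documentation
      baseURL: "https://api.llmrouter.app/v1"
      models:
        default: ["claude-sonnet-4"]   # placeholder model name
        fetch: true                    # pull the full model list automatically
      modelDisplayLabel: "LLM Router"
```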
Tip: Setting fetch: true automatically pulls all available models from LLM Router.
5. Start LibreChat
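With Docker Desktop running, the stack is typically started with:

```shell
# Build (if needed) and start all LibreChat services in the background
docker compose up -d
```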
Once the containers are running, open http://localhost:3080 in your browser.
6. Select LLM Router Endpoint
In the LibreChat interface:
- Click the endpoint dropdown at the top
- Select LLM Router
- Choose your preferred model
Configuration Options
You can further customize the endpoint in librechat.yaml:
- titleConvo: Enable automatic conversation titles
- titleModel: Model used for generating titles
- dropParams: Remove unsupported parameters
- modelDisplayLabel: Custom label for the endpoint
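Putting those options together, an endpoint entry might look like this (model names and the baseURL are placeholders, not values confirmed by the LLM Router docs):

```yaml
endpoints:
  custom:
    - name: "LLM Router"
      apiKey: "${LLM_ROUTER_API_KEY}"
      baseURL: "https://api.llmrouter.app/v1"   # assumed URL
      models:
        fetch: true
      titleConvo: true                 # auto-generate conversation titles
      titleModel: "gpt-4o-mini"        # placeholder title model
      dropParams: ["stop"]             # strip parameters the upstream rejects
      modelDisplayLabel: "LLM Router"  # label shown in the chat UI
```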