Documentation Index
Fetch the complete documentation index at: https://docs.llmrouter.app/llms.txt. Use this file to discover all available models, tags, and features before exploring further.
LiteLLM Integration
LiteLLM is one of the most popular open-source libraries for calling LLMs. It provides a unified, OpenAI-compatible interface for 100+ models and providers. Since LLM Router is fully OpenAI-compatible, you can use it seamlessly with LiteLLM to get intelligent routing, Skills, Zero Data Retention, and automatic cost optimization.
Installation
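LiteLLM is distributed on PyPI, so installation is a single pip command:

```shell
pip install litellm
```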
Basic Configuration
You can use LLM Router with LiteLLM in two ways.
Method 1: Using base_url (Recommended)
Method 2: Set Environment Variables (Global)
Using Advanced LLM Router Features
Recommended Models
Best Practice: Use Dashboard Configuration
Instead of passing options in every request, we strongly recommend configuring defaults in the LLM Router Dashboard:
- Set default tags (`coding`, `reasoning`, `ui design`, etc.)
- Enable relevant Skills
- Turn on Zero Data Retention (ZDR)
- Configure context optimization rules
Async Usage Example
Pro Tip: LiteLLM + LLM Router is a powerful combination for production applications: you get LiteLLM's fallback, logging, and observability features on top of LLM Router's smart routing and cost-saving capabilities.