
If your codebase is already heavily integrated with the official @anthropic-ai/sdk, you don’t need to rewrite it to use the OpenAI format. LLM Router provides a fully compatible Anthropic endpoint. By routing the Anthropic SDK through LLM Router, you instantly gain PII Redaction, Context Compression, and Automatic Fallbacks (e.g., falling back to Amazon Bedrock if Anthropic’s direct API is down).
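The fallback behavior described above happens server-side inside LLM Router, but the idea is easy to illustrate client-side. The sketch below is purely illustrative: the withFallback helper and both provider callbacks are assumptions for the example, not part of the LLM Router or Anthropic APIs.

```typescript
// Hypothetical client-side sketch of the fallback idea LLM Router applies
// server-side: attempt the primary provider, and on any failure route the
// same request to a backup provider.
async function withFallback<T>(
  primary: () => Promise<T>,
  fallback: () => Promise<T>
): Promise<T> {
  try {
    return await primary();
  } catch {
    // e.g. fall back to Amazon Bedrock when Anthropic's direct API is down
    return fallback();
  }
}
```

With LLM Router this logic lives behind the single baseURL, so the client code stays unchanged when a fallback fires.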

Configuration

To use LLM Router, override the baseURL when initializing the Anthropic client and authenticate with your LLM Router API key.
import Anthropic from "@anthropic-ai/sdk";

// 1. Initialize with LLM Router credentials
const anthropic = new Anthropic({
  baseURL: "https://api.llmrouter.app",
  apiKey: process.env.LLM_ROUTER_API_KEY,
});

async function main() {
  const msg = await anthropic.messages.create({
    // 2. Use standard Anthropic models
    model: "anthropic/claude-3-5-sonnet",
    max_tokens: 1024,
    messages: [{ role: "user", content: "Write a Python script." }],
  });

  // Note: content is an array of content blocks, not a plain string
  console.log(msg.content);
}
main().catch(console.error);
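Because msg.content is an array of content blocks rather than a string, it is common to pull out just the text portions before logging or storing the reply. A minimal helper for that, with the block shape assumed from the Anthropic SDK's response types (only blocks of type "text" carry a text field):

```typescript
// Minimal shape of a response content block; non-text blocks (e.g. tool_use)
// are passed through the union's fallback arm and skipped below.
type ContentBlock = { type: "text"; text: string } | { type: string };

// Concatenate the text of all "text" blocks in order, ignoring other kinds.
function extractText(blocks: ContentBlock[]): string {
  return blocks
    .filter((b): b is { type: "text"; text: string } => b.type === "text")
    .map((b) => b.text)
    .join("");
}
```

In the example above, console.log(extractText(msg.content)) would print only the model's text instead of the raw block array.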