If your codebase is already heavily integrated with the official @anthropic-ai/sdk, you don’t need to rewrite it to use the OpenAI format. LLM Router provides a fully compatible Anthropic endpoint.
By routing the Anthropic SDK through LLM Router, you instantly gain PII Redaction, Context Compression, and Automatic Fallbacks (e.g., falling back to Amazon Bedrock if Anthropic’s direct API is down).
Configuration
To use LLM Router, you simply override the baseURL option when initializing the Anthropic client.