A LiteLLM provider that makes the Claude Code SDK available through the standard OpenAI-compatible API interface, based on Anthropic's official Claude Code LLM Gateway documentation:
```
┌─────────────────┐         ╭──────────────╮         ┌─────────────────┐
│                 │         │              │         │ Open WebUI,     │
│   Claude Code   │ ◄─────► │   LiteLLM    │ ◄─────► │ Graphiti,       │
│                 │         │              │         │ LangChain, etc. │
└─────────────────┘         ╰──────────────╯         └─────────────────┘
     OAuth/API                Translation            OpenAI Compatible App
```
```bash
docker pull ghcr.io/cabinlab/litellm-claude-code:latest
```

- ✅ Claude Pro/Max Plan: Uses Claude Code's OAuth authentication (no API keys needed)
- ✅ OpenAI API Compatibility: Use Claude Code in apps expecting OpenAI API keys
- ✅ Docker Deployment: LiteLLM + Claude Code SDK in a single container
- ✅ Model Selection: Supports all Claude models (Opus, Sonnet, Haiku)
- ✅ Standard Interface: Drop-in replacement for OpenAI API
- Docker
- Claude Pro or Max subscription OR Anthropic API key
Download the configuration files:
```bash
# Create a directory for the project
mkdir litellm-claude-code && cd litellm-claude-code

# Download docker-compose.yml
curl -O https://raw.githubusercontent.com/cabinlab/litellm-claude-code/main/docker-compose.yml

# Download .env.example
curl -o .env https://raw.githubusercontent.com/cabinlab/litellm-claude-code/main/.env.example
```
Or if you prefer to clone the repository:
```bash
git clone https://github.com/cabinlab/litellm-claude-code.git
cd litellm-claude-code
cp .env.example .env
```
Set your master key (REQUIRED):
```bash
# Edit .env and update LITELLM_MASTER_KEY
LITELLM_MASTER_KEY=sk-your-desired-custom-key
```

⚠️ See the Security Guide for key generation best practices.
Get your Claude OAuth token (wherever you have Claude Code installed):
```bash
# If you don't have the Claude CLI installed:
npm install -g @anthropic-ai/claude-code

# Generate a long-lived token
claude setup-token

# Follow the web-based auth flow and copy the token that starts with sk-ant-oat01-
```
Add the token to your .env file:
```bash
# Edit .env and add your token:
CLAUDE_CODE_OAUTH_TOKEN=sk-ant-oat01-your-token-here
```
Start the services:
```bash
docker-compose up -d
```
Verify it's working:
Navigate to `http://localhost:4000/ui/` and click "Test Key" to try the interactive interface, or check from the command line:

```bash
# Check health (replace with your LITELLM_MASTER_KEY)
curl http://localhost:4000/health -H "Authorization: Bearer sk-your-desired-custom-key"

# List models
curl http://localhost:4000/v1/models -H "Authorization: Bearer sk-your-desired-custom-key"
```
The API is now available at `http://localhost:4000/v1`.
| Model Name | Description |
|---|---|
| `sonnet` | Claude Sonnet (latest) |
| `opus` | Claude Opus (latest) |
| `claude-3-5-haiku-20241022` | Claude 3.5 Haiku |
| `default` | Starts with Opus, falls back to Sonnet |
The default settings automatically use the latest Sonnet and Opus models, and the last-known release of Haiku. You can try other models from Anthropic's list:
- Edit `config/litellm_config.yaml` to add/modify models:
```yaml
model_list:
  - model_name: {Anthropic's official model name}
    litellm_params:
      model: claude-code-sdk/claude-model
```

- Restart the container:
```bash
docker-compose restart litellm
```
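As a concrete instance of the template above, an entry pinning the dated Haiku release from the model table might look like the following. This is illustrative only (the default config already exposes this model name):

```yaml
model_list:
  - model_name: claude-3-5-haiku-20241022
    litellm_params:
      model: claude-code-sdk/claude-3-5-haiku-20241022
```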
Long-lived OAuth Tokens (Recommended for Claude Pro/Max users)
- Generate with `claude setup-token` on your host machine
- Set `CLAUDE_CODE_OAUTH_TOKEN` in your `.env` file
- Tokens start with `sk-ant-oat01-` and last for 1 year
- Authentication persists across container restarts via a Docker volume
Interactive Authentication (Alternative)
View steps
```bash
# Enter the container
docker exec -it litellm-claude-litellm-1 bash

# Run claude to authenticate
claude

# Follow the browser authentication flow
```
Anthropic API Keys
View details
- Can set `ANTHROPIC_API_KEY` in `.env`
- May override Pro/Max subscription benefits
- Uses API credits instead of subscription
The Docker setup includes a named volume for authentication:
```yaml
volumes:
  - claude-auth:/home/claude/.claude
```

This ensures authentication persists across container restarts.
Client Application → LiteLLM Proxy → Claude Code SDK Provider → Claude Code SDK → Claude API
The provider:
- Receives OpenAI-format requests from LiteLLM
- Converts messages to Claude prompt format
- Extracts the model name and creates `ClaudeCodeOptions(model=...)`
- Calls the Claude Code SDK with OAuth authentication
- Returns response in OpenAI format
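The message-translation step above can be sketched in Python. This is an illustrative flattening of OpenAI-style chat messages into a single prompt string; the role prefixes and joining scheme are assumptions, not the provider's actual format:

```python
def messages_to_prompt(messages):
    """Flatten OpenAI-style chat messages into one prompt string.

    Illustrative sketch of the translation step only; the real
    provider implementation and role prefixes may differ.
    """
    prefixes = {"system": "System", "user": "Human", "assistant": "Assistant"}
    parts = [f"{prefixes[m['role']]}: {m['content']}" for m in messages]
    return "\n\n".join(parts)

prompt = messages_to_prompt([
    {"role": "system", "content": "Be concise."},
    {"role": "user", "content": "Hello, Claude!"},
])
# prompt == "System: Be concise.\n\nHuman: Hello, Claude!"
```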
View Python example
```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-your-desired-custom-key",  # Your LITELLM_MASTER_KEY from .env
    base_url="http://localhost:4000/v1"
)

response = client.chat.completions.create(
    model="sonnet",
    messages=[{"role": "user", "content": "Hello, Claude!"}]
)

print(response.choices[0].message.content)
```

📚 See Usage Examples for more languages and frameworks (cURL, LangChain, JavaScript, Graphiti, etc.)
View common issues and solutions
"Invalid API key" or 401 Unauthorized
- Ensure you're using your `LITELLM_MASTER_KEY` value (not the OAuth token)
- The master key must start with `sk-`
- Check your Authorization header: `Bearer sk-your-desired-custom-key`
"Authentication failed" from Claude SDK
- Your OAuth token may have expired
- Regenerate with `claude setup-token` on your host machine
- Update `CLAUDE_CODE_OAUTH_TOKEN` in your `.env` file
- Restart the container: `docker-compose restart litellm`
"Model not found"
- Check available models: `curl http://localhost:4000/v1/models -H "Authorization: Bearer sk-your-desired-custom-key"`
- Valid model names: `sonnet`, `opus`, `claude-3-5-haiku-20241022`, `default`
- Model names are case-sensitive
Slow responses or timeouts
- The first request after startup may be slower while establishing connections
- Claude Code SDK responses can take 5-10 seconds for complex queries
- Consider increasing timeout values in your client application
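Client-side, the timeout is usually just a parameter on the HTTP call. A minimal standard-library sketch follows; the helper name and default are hypothetical, and with the `openai` Python client you can pass `timeout=` when constructing the client instead:

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, messages, timeout=120):
    """Build a chat-completions request plus the timeout to use.

    Hypothetical helper for illustration: pass the returned timeout to
    urllib.request.urlopen(req, timeout=timeout) when sending it.
    """
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
    return req, timeout

req, timeout = build_chat_request(
    "http://localhost:4000/v1",
    "sk-your-desired-custom-key",
    "sonnet",
    [{"role": "user", "content": "Say hello"}],
    timeout=120,  # generous timeout for slow Claude Code SDK responses
)
```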
WebSocket Connection Failed
- Ensure you're accessing the page via HTTP (not HTTPS)
- Check that port 4000 is accessible
- Try refreshing the page
OAuth Page Shows Error
- This is normal - the CLI handles the callback internally
- Return to the terminal to continue authentication
```bash
# Quick test
curl -X POST http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-your-desired-custom-key" \
  -d '{
    "model": "sonnet",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```
```bash
# With streaming (partial support)
curl -X POST http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-your-desired-custom-key" \
  -d '{
    "model": "sonnet",
    "messages": [{"role": "user", "content": "Count to 5"}],
    "stream": true
  }'
```

View build instructions
If you want to build the image locally instead of using the pre-built image:
```bash
# Clone the repository
git clone https://github.com/cabinlab/litellm-claude-code.git
cd litellm-claude-code

# Build and run with docker-compose override
docker-compose -f docker-compose.yml -f docker-compose.override.yml up --build
```

MIT License
