```bash
# Using UV (recommended)
uv sync

# Or using pip
pip install -r requirements.txt
```

Choose your LLM provider and configure accordingly:
**OpenAI:**

```bash
cp .env.example .env
# Edit .env:
# OPENAI_API_KEY="sk-your-openai-key"
# BIG_MODEL="gpt-4o"
# SMALL_MODEL="gpt-4o-mini"
```

**Azure OpenAI:**

```bash
cp .env.example .env
# Edit .env:
# OPENAI_API_KEY="your-azure-key"
# OPENAI_BASE_URL="https://your-resource.openai.azure.com/openai/deployments/your-deployment"
# BIG_MODEL="gpt-4"
# SMALL_MODEL="gpt-35-turbo"
```

**Local models (Ollama):**

```bash
cp .env.example .env
# Edit .env:
# OPENAI_API_KEY="dummy-key"
# OPENAI_BASE_URL="http://localhost:11434/v1"
# BIG_MODEL="llama3.1:70b"
# SMALL_MODEL="llama3.1:8b"
```

```bash
# Start the proxy server
python start_proxy.py

# In another terminal, use with Claude Code
ANTHROPIC_BASE_URL=http://localhost:8082 claude
```

| Your Input | Proxy Action | Result |
|---|---|---|
| Claude Code sends `claude-3-5-sonnet-20241022` | Maps to your `BIG_MODEL` | Uses `gpt-4o` (or whatever you configured) |
| Claude Code sends `claude-3-5-haiku-20241022` | Maps to your `SMALL_MODEL` | Uses `gpt-4o-mini` (or whatever you configured) |
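The mapping above can be pictured as a simple substring rule. A minimal sketch, assuming the proxy routes on the model name and passes unknown names through unchanged (the function name `map_model` and the fallback behavior are illustrative, not the proxy's actual code):

```python
import os

def map_model(requested: str,
              big: str = os.getenv("BIG_MODEL", "gpt-4o"),
              small: str = os.getenv("SMALL_MODEL", "gpt-4o-mini")) -> str:
    """Route Claude model names to the configured backend models."""
    name = requested.lower()
    if "haiku" in name:
        return small                 # haiku -> SMALL_MODEL
    if "sonnet" in name or "opus" in name:
        return big                   # sonnet/opus -> BIG_MODEL
    return requested                 # pass anything else through unchanged

print(map_model("claude-3-5-sonnet-20241022"))  # prints your BIG_MODEL (default "gpt-4o")
```

With the defaults above, a request for any `claude-3-5-haiku-*` model is rewritten to `gpt-4o-mini` before being forwarded to the backend.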
- Python 3.9+
- API key for your chosen provider
- Claude Code CLI installed
- 2 minutes to configure
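The Python version requirement can be enforced with a small startup check. A sketch only; `check_python` is a hypothetical helper, not part of the proxy:

```python
import sys

def check_python(min_version=(3, 9)):
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info >= min_version

# Fail fast with a clear message instead of an obscure SyntaxError later.
if not check_python():
    raise SystemExit(f"Python 3.9+ required, found {sys.version.split()[0]}")
```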
- Server runs on `http://localhost:8082`
- Maps `haiku` → `SMALL_MODEL`, `sonnet`/`opus` → `BIG_MODEL`
- Supports streaming, function calling, and images
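Once the proxy is running, you can also exercise it without Claude Code by POSTing an Anthropic-style Messages request directly. A minimal sketch, assuming the proxy mirrors the Anthropic `/v1/messages` endpoint shape (the path, headers, and `dummy` key are assumptions):

```python
import json
import urllib.request

# Anthropic Messages API-style payload; the proxy rewrites the model name
# before forwarding the request to your OpenAI-compatible backend.
payload = {
    "model": "claude-3-5-sonnet-20241022",  # will be mapped to BIG_MODEL
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "Say hello in one word."}],
}

req = urllib.request.Request(
    "http://localhost:8082/v1/messages",  # assumed endpoint path
    data=json.dumps(payload).encode(),
    headers={"content-type": "application/json", "x-api-key": "dummy"},
    method="POST",
)

# Uncomment once the proxy is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```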
```bash
# Quick test
python src/test_claude_to_openai.py
```

That's it! Now Claude Code can use any OpenAI-compatible provider! 🎉