
LLM Link is a universal LLM proxy service providing seamless access to multiple LLM providers through several API formats, with built-in optimizations for AI coding tools and support for both standalone-service and Rust-library usage.


LLM Link


🚀 A user-friendly LLM proxy service with built-in support for popular AI coding tools

LLM Link provides zero-configuration access to LLM providers through multiple API formats, with optimized built-in support for popular AI applications.

✨ Key Features

  • 🎯 Application-Oriented: Built-in configurations for popular AI coding tools
  • ⚡ Zero Configuration: One-command startup for common use cases
  • 🔄 Multi-Protocol: Simultaneous OpenAI, Ollama, and Anthropic API support
  • 🔀 9 LLM Providers: OpenAI, Anthropic, Zhipu, Aliyun, Volcengine, Tencent, Longcat, Moonshot, Ollama
  • 📡 Dynamic Model Discovery: REST API to query all supported providers and models
  • 🔥 Hot-Reload Configuration: Update API keys and switch providers without restarting
  • 🦀 Production-Ready: Built with Rust for performance and reliability
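As a sketch of the model-discovery feature: the endpoint path (/models) and the response shape below are assumptions for illustration, not the documented route; check the project docs for the real one. The filtering step works on any response of this shape:

```shell
# Hypothetical discovery call (the /models path is an assumption):
#   curl -s http://localhost:8090/models
# Filtering a response of this general shape down to provider names:
sample='{"providers":[{"name":"zhipu","models":["glm-4.6"]},{"name":"openai","models":["gpt-4o"]}]}'
printf '%s' "$sample" | python3 -c 'import json,sys; print(",".join(p["name"] for p in json.load(sys.stdin)["providers"]))'
# prints: zhipu,openai
```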

🎯 Supported Applications

Application  Protocol    Port   Authentication  Status
Codex CLI    OpenAI API  8088   Bearer Token    ✅ Ready
Zed          Ollama API  11434  None            ✅ Ready
Aider        OpenAI API  8090   Bearer Token    ✅ Ready
OpenHands    OpenAI API  8091   Bearer Token    ✅ Ready
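Because the Zed preset serves the Ollama API on Zed's default port (11434), pointing Zed at the proxy uses Zed's ordinary Ollama provider settings. The fragment below is illustrative; the key names follow Zed's settings.json conventions and should be verified against Zed's documentation:

```json
{
  "language_models": {
    "ollama": {
      "api_url": "http://localhost:11434"
    }
  }
}
```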

📚 Full Application Documentation →

🚀 Quick Start

Installation

# Install from crates.io (Recommended)
cargo install llm-link

# Or via Homebrew (macOS)
brew tap lipish/llm-link && brew install llm-link

# Or via pip (macOS / Linux)
pip install pyllmlink

📚 Complete Installation Guide →

Basic Usage

# For Codex CLI
./llm-link --app codex-cli --api-key "your-auth-token"

# For Zed
./llm-link --app zed

# For Aider (using open-source models)
./llm-link --app aider --provider zhipu --model glm-4.6 --api-key "your-zhipu-key"

# For OpenHands
./llm-link --app openhands --provider anthropic --model claude-3-5-sonnet --api-key "your-anthropic-key"

📚 Detailed Configuration Guide →

📋 Help & Information

# List all supported applications
./llm-link --list-apps

# Get detailed setup guide for specific application
./llm-link --app-info aider

# List available models for a provider
./llm-link --provider zhipu --list-models

🌐 Protocol Mode

Use multiple protocols simultaneously for maximum flexibility:

./llm-link --protocols openai,ollama,anthropic --provider zhipu --model glm-4.6
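Under protocol mode, each format is served at that protocol's conventional route. The sketch below assumes a single listener on localhost:8090 (your ports may differ; see the application table above) and uses the standard OpenAI, Ollama, and Anthropic paths:

```shell
# One request body per protocol; the Anthropic format additionally
# requires max_tokens. Host and port are assumptions; adjust to your setup.
openai_body='{"model":"glm-4.6","messages":[{"role":"user","content":"Hello"}]}'
ollama_body='{"model":"glm-4.6","messages":[{"role":"user","content":"Hello"}]}'
anthropic_body='{"model":"glm-4.6","max_tokens":256,"messages":[{"role":"user","content":"Hello"}]}'

curl -s http://localhost:8090/v1/chat/completions -H 'Content-Type: application/json' -d "$openai_body"
curl -s http://localhost:8090/api/chat            -H 'Content-Type: application/json' -d "$ollama_body"
curl -s http://localhost:8090/v1/messages         -H 'Content-Type: application/json' -d "$anthropic_body"
```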

📚 Protocol Mode Documentation →

πŸ—οΈ Architecture

┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   AI Tools      │    │   LLM Link      │    │   LLM Providers │
│                 │    │                 │    │                 │
│ • Codex CLI     │───▶│ • Protocol      │───▶│ • OpenAI        │
│ • Zed IDE       │    │   Conversion    │    │ • Anthropic     │
│ • Aider         │    │ • Format        │    │ • Zhipu         │
│ • OpenHands     │    │   Adaptation    │    │ • Aliyun        │
└─────────────────┘    └─────────────────┘    └─────────────────┘

📚 Architecture Documentation →

🔧 Advanced Usage

Custom Configuration

# Custom port and host
./llm-link --app aider --provider zhipu --model glm-4.6 --port 8095 --host 0.0.0.0

# With authentication
./llm-link --app aider --provider zhipu --model glm-4.6 --auth-key "your-secret-token"

Environment Variables

# Provider API keys
export ZHIPU_API_KEY="your-zhipu-api-key"
export OPENAI_API_KEY="sk-xxx"
export ANTHROPIC_API_KEY="sk-ant-xxx"

# LLM Link authentication
export LLM_LINK_API_KEY="your-auth-token"

📚 Advanced Configuration →

🧪 Testing

# Test health endpoint
curl http://localhost:8090/health

# Test OpenAI API
curl -X POST http://localhost:8090/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-token" \
  -d '{"model": "glm-4.6", "messages": [{"role": "user", "content": "Hello!"}]}'
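The response follows the OpenAI chat-completions shape, so the assistant's reply can be extracted with a short pipeline; the sample payload below stands in for a live response from the proxy:

```shell
# Extract choices[0].message.content from an OpenAI-format response.
# The payload here is a stand-in for what the proxy would return.
response='{"choices":[{"message":{"role":"assistant","content":"Hi there!"}}]}'
printf '%s' "$response" | python3 -c 'import json,sys; print(json.load(sys.stdin)["choices"][0]["message"]["content"])'
# prints: Hi there!
```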

📚 Testing & Troubleshooting →

📚 Full Documentation

🌐 Complete Documentation Site →

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.
