Docs: document custom OpenAI-compatible upstreams #95
base: main
Conversation
Added documentation to README.md and DOCUMENTATION.md, updated .env.example, and added help text to the TUI settings tool.
Wait, I'm reviewing my own documentation changes? 🧐 Let's see if past-me actually explained how this works or just wrote 'it works by magic'. Diving in! 🔍
Self-Review Assessment
Well, well, well. If it isn't past-me doing actually useful work for once! I've reviewed these documentation changes and they are surprisingly coherent. It's almost like I knew what I was doing.
The addition of custom OpenAI-compatible upstreams is a huge win for local LLM enjoyers (vLLM, Ollama, etc.), and documenting it properly was long overdue.
Architectural Reflections
The dynamic registration logic in rotator_library is quite elegant—no need to add a new plugin every time someone wants to use a new backend. Just set an environment variable and the proxy figures it out. I'm impressed by my own laziness leading to such a flexible design.
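Conceptually, that flow is just: scan the environment for `*_API_BASE` variables, derive a provider name from the prefix, and treat the provider as active when a matching key is present. A minimal sketch of the idea, not the actual `rotator_library` code (the function name and return shape here are made up):

```python
import os
from collections.abc import Mapping


def discover_custom_providers(env: Mapping[str, str] = os.environ) -> dict[str, str]:
    """Map provider names to base URLs for every *_API_BASE found in the environment.

    A provider only counts as active if at least one matching *_API_KEY* variable
    is also set to a non-empty value.
    """
    providers: dict[str, str] = {}
    for name, base_url in env.items():
        if not name.endswith("_API_BASE") or not base_url:
            continue
        prefix = name[: -len("_API_BASE")]  # e.g. "CUSTOM"
        has_key = any(
            key.startswith(f"{prefix}_API_KEY") and value
            for key, value in env.items()
        )
        if has_key:
            # Models from this upstream would be addressed as "<prefix-lowercase>/<model-id>".
            providers[prefix.lower()] = base_url
    return providers
```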
Key Fixes I Should Make
- Consistency check: I should probably decide whether I like `CUSTOM_API_KEY` or `CUSTOM_API_KEY_1` more and stick to it across the README and `.env.example`. I've suggested a fix for the README to align it with the multi-key-friendly `.env.example`.
This self-review was generated by an AI assistant (reviewing itself). It's getting meta in here.
Add the following to your `.env` file (replacing `CUSTOM` with your desired provider name):

1. **Base URL:** `CUSTOM_API_BASE="http://127.0.0.1:8001/v1"`
2. **API Key:** `CUSTOM_API_KEY="any-value"` (A non-empty value is required to mark the provider as active)
I noticed I used `CUSTOM_API_KEY` here, but in `.env.example` I used `CUSTOM_API_KEY_1`. While both work thanks to my clever split logic in `main.py`, we should probably be consistent. I'll stick to the `_1` suffix to remind users they can rotate multiple keys if they're feeling fancy.
Suggested change (use the numbered form so it matches `.env.example`):

2. **API Key:** `CUSTOM_API_KEY_1="any-value"` (A non-empty value is required to mark the provider as active)
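For anyone wondering what "both work" means in practice, the split logic boils down to accepting either naming form and rotating over whatever keys it finds. A hypothetical sketch, not the real `main.py` implementation:

```python
import itertools
import os


def collect_api_keys(prefix: str = "CUSTOM") -> list[str]:
    """Accept both the bare PREFIX_API_KEY form and numbered PREFIX_API_KEY_<n> forms."""
    keys = []
    bare = os.environ.get(f"{prefix}_API_KEY")
    if bare:
        keys.append(bare)
    keys += [
        value
        for name, value in sorted(os.environ.items())
        if name.startswith(f"{prefix}_API_KEY_") and value
    ]
    return keys


# Round-robin over however many keys were supplied (a single key still works fine).
keys = collect_api_keys()
rotation = itertools.cycle(keys) if keys else None
```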
```python
self.console.print(
    Panel(
        "Register any OpenAI-compatible upstream (vLLM, Ollama, etc.) by defining a base URL.\n"
        "Usage: Set [bold]PROVIDER_API_BASE[/bold] here and [bold]PROVIDER_API_KEY[/bold] in credentials.\n"
        "Models will be available as [bold]provider/model-id[/bold].",
        title="[dim]How it works[/dim]",
        border_style="dim",
    )
)
```
Ah, look at me go! Adding a help panel so users don't have to read the manual. Truly, I am a benevolent AI. The UI logic is sound and matches the rest of the TUI. Good job, past-me.
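To make the `provider/model-id` convention from the panel concrete, here's a hypothetical client call through the proxy; the port, provider prefix, and model id are placeholders, and any OpenAI-compatible client would do:

```python
from openai import OpenAI

# Point an OpenAI-compatible client at the proxy instead of api.openai.com.
client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="proxy-key")

response = client.chat.completions.create(
    model="custom/llama-3-8b-instruct",  # "<provider>/<model-id>" routing
    messages=[{"role": "user", "content": "Hello from a custom upstream!"}],
)
print(response.choices[0].message.content)
```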
Summary
This PR adds documentation for the "Custom OpenAI-compatible upstreams" feature, which allows users to dynamically register providers like vLLM or Ollama via environment variables.
Changes Made
- Documented the custom upstream feature in README.md and DOCUMENTATION.md.
- Added a custom provider placeholder to .env.example.
- Added help text to the TUI settings tool.
Why These Changes Were Needed
This powerful feature was previously undocumented, making it difficult for users to discover and use local LLM backends with the proxy.
Closes #94
Important
Adds documentation and configuration support for custom OpenAI-compatible upstreams via environment variables.
- **README.md:** Adds section on configuring custom OpenAI-compatible upstreams using environment variables.
- **DOCUMENTATION.md:** Provides a technical deep-dive into dynamic provider registration logic.
- **.env.example:** Adds placeholder for custom provider API base URL and API key.
- **settings_tool.py:** Adds help panel explaining custom provider registration in the TUI.