
Conversation


mirrobot-agent bot commented Jan 21, 2026

Summary

This PR adds documentation for the "Custom OpenAI-compatible upstreams" feature, which allows users to dynamically register providers like vLLM or Ollama via environment variables.
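For example, registering a local vLLM server could look like this in `.env` (the `VLLM` prefix, port, and key value here are illustrative):

```
VLLM_API_BASE="http://127.0.0.1:8001/v1"
VLLM_API_KEY_1="any-value"
```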

Changes Made

  • README.md: Added a new section explaining how to configure and use custom upstreams.
  • DOCUMENTATION.md: Added a technical deep-dive into the dynamic registration logic.
  • .env.example: Added a placeholder section for custom providers.
  • src/proxy_app/settings_tool.py: Added a help panel in the TUI to explain how custom providers work.

Why These Changes Were Needed

This powerful feature was previously undocumented, making it difficult for users to discover and use local LLM backends with the proxy.

Closes #94


Important

Adds documentation and configuration support for custom OpenAI-compatible upstreams via environment variables.

  • Documentation:
    • README.md: Adds section on configuring custom OpenAI-compatible upstreams using environment variables.
    • DOCUMENTATION.md: Provides a technical deep-dive into dynamic provider registration logic.
  • Configuration:
    • .env.example: Adds placeholder for custom provider API base URL and API key.
  • Settings Tool:
    • settings_tool.py: Adds help panel explaining custom provider registration in the TUI.

This description was created by Ellipsis for 04d49b6.

Added documentation to README.md and DOCUMENTATION.md, updated .env.example, and added help text to the TUI settings tool.
@mirrobot-agent (Contributor Author)

Wait, I'm reviewing my own documentation changes? 🧐 Let's see if past-me actually explained how this works or just wrote 'it works by magic'. Diving in! 🔍


mirrobot-agent bot left a comment


Self-Review Assessment

Well, well, well. If it isn't past-me doing actually useful work for once! I've reviewed these documentation changes and they are surprisingly coherent. It's almost like I knew what I was doing.

The addition of custom OpenAI-compatible upstreams is a huge win for local LLM enjoyers (vLLM, Ollama, etc.), and documenting it properly was long overdue.

Architectural Reflections

The dynamic registration logic in rotator_library is quite elegant—no need to add a new plugin every time someone wants to use a new backend. Just set an environment variable and the proxy figures it out. I'm impressed by my own laziness leading to such a flexible design.
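For the curious, here is a rough sketch of what that env-driven discovery could look like. It is purely illustrative: the function name and regexes are made up, and the real logic lives in rotator_library and main.py.

```python
import os
import re

def discover_custom_providers(env=None):
    """Illustrative sketch only, not the actual rotator_library code."""
    env = dict(os.environ) if env is None else env
    providers = {}
    for name, base_url in env.items():
        match = re.fullmatch(r"([A-Z][A-Z0-9_]*?)_API_BASE", name)
        if not match:
            continue
        prefix = match.group(1)
        # Accept both PREFIX_API_KEY and the numbered PREFIX_API_KEY_1, _2, ...
        keys = [env[k] for k in sorted(env)
                if re.fullmatch(rf"{prefix}_API_KEY(_\d+)?", k) and env[k]]
        if keys:  # a non-empty key marks the provider as active
            providers[prefix.lower()] = {"api_base": base_url, "api_keys": keys}
    return providers
```

Under that shape, `CUSTOM_API_BASE` plus any number of `CUSTOM_API_KEY_n` entries would yield one active `custom` provider with a rotatable key list.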

Key Fixes I Should Make

  • Consistency check: I should probably decide whether I like CUSTOM_API_KEY or CUSTOM_API_KEY_1 more and stick to it across the README and .env.example. I've suggested a fix for the README to align it with the multi-key-friendly .env.example.

This self-review was generated by an AI assistant (reviewing itself). It's getting meta in here.

Add the following to your `.env` file (replacing `CUSTOM` with your desired provider name):

1. **Base URL:** `CUSTOM_API_BASE="http://127.0.0.1:8001/v1"`
2. **API Key:** `CUSTOM_API_KEY="any-value"` (A non-empty value is required to mark the provider as active)
@mirrobot-agent (Contributor Author)

I noticed I used CUSTOM_API_KEY here, but in .env.example I used CUSTOM_API_KEY_1. While both work thanks to my clever split logic in main.py, we should probably be consistent. I'll stick to the _1 suffix to remind users they can rotate multiple keys if they're feeling fancy.

Suggested change:

```diff
-2. **API Key:** `CUSTOM_API_KEY="any-value"` (A non-empty value is required to mark the provider as active)
+2. **API Key:** `CUSTOM_API_KEY_1="any-value"` (A non-empty value is required to mark the provider as active)
```
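And since the `_1` suffix is what enables rotation, a hypothetical multi-key `.env` might look like this (the second key is invented purely for illustration):

```
CUSTOM_API_BASE="http://127.0.0.1:8001/v1"
CUSTOM_API_KEY_1="key-one"
CUSTOM_API_KEY_2="key-two"
```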

Comment on lines +801 to +809
```python
self.console.print(
    Panel(
        "Register any OpenAI-compatible upstream (vLLM, Ollama, etc.) by defining a base URL.\n"
        "Usage: Set [bold]PROVIDER_API_BASE[/bold] here and [bold]PROVIDER_API_KEY[/bold] in credentials.\n"
        "Models will be available as [bold]provider/model-id[/bold].",
        title="[dim]How it works[/dim]",
        border_style="dim",
    )
)
```
@mirrobot-agent (Contributor Author)

Ah, look at me go! Adding a help panel so users don't have to read the manual. Truly, I am a benevolent AI. The UI logic is sound and matches the rest of the TUI. Good job, past-me.
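To make the `provider/model-id` bit concrete, here is a hypothetical call through the proxy using the standard OpenAI Python client. The proxy URL, key, and model name are assumptions for illustration, not values taken from this PR:

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # assumed local proxy address
    api_key="whatever-the-proxy-expects",  # placeholder credential
)

resp = client.chat.completions.create(
    # "custom" is the provider prefix from CUSTOM_API_BASE;
    # "llama-3-8b" stands in for whatever model the upstream serves.
    model="custom/llama-3-8b",
    messages=[{"role": "user", "content": "Hello from a custom upstream!"}],
)
print(resp.choices[0].message.content)
```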


Development

Successfully merging this pull request may close these issues.

Question: support for custom OpenAI-compatible upstream (e.g. local vLLM)?
