OpenLLM Monitor – Real-Time Observability Dashboard for LLM APIs (OpenAI, Ollama, OpenRouter) #1166
prajeesh-chavan started this conversation in Show and tell
Hey everyone!
I just built and open-sourced OpenLLM Monitor, a plug-and-play dashboard that gives developers full visibility into how their LLM prompts are performing: cost, latency, token usage, retries, and more.
What It Does
- Logs prompts and responses across OpenAI, Ollama, and OpenRouter
- Tracks cost, latency, token usage, and retries in real time
Built with
Node.js · Express · MongoDB · React · Tailwind · WebSockets
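To give a feel for the kind of instrumentation involved, here is a minimal sketch in plain Node.js. It is not the project's actual code: the `withMonitoring` helper, its options, and the stubbed `fakeLlm` response shape are assumptions for illustration, modeled on the usage fields OpenAI-style APIs return.

```javascript
// Hypothetical sketch: wrap a provider call and capture the metrics an
// observability dashboard would surface (latency, token usage, retries).
async function withMonitoring(callLlm, { maxRetries = 2 } = {}) {
  const start = Date.now();
  let retries = 0;
  while (true) {
    try {
      const response = await callLlm();
      return {
        response,
        metrics: {
          latencyMs: Date.now() - start,
          retries,
          promptTokens: response.usage?.prompt_tokens ?? 0,
          completionTokens: response.usage?.completion_tokens ?? 0,
        },
      };
    } catch (err) {
      // Retry transient failures, counting each attempt.
      if (retries >= maxRetries) throw err;
      retries += 1;
    }
  }
}

// Stubbed provider call standing in for OpenAI / Ollama / OpenRouter.
const fakeLlm = async () => ({
  choices: [{ message: { content: "Hello!" } }],
  usage: { prompt_tokens: 12, completion_tokens: 3 },
});

withMonitoring(fakeLlm).then(({ metrics }) => console.log(metrics));
```

In a real setup the collected metrics object would be written to MongoDB and pushed to the React dashboard over a WebSocket rather than logged to the console.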
Why I Built It
I found myself debugging LLM completions with zero visibility: no logs, no token breakdown, no replay. This tool solves that pain, especially for indie devs and teams building GenAI apps.
GitHub:
github.com/prajeesh-chavan/openllm-monitor
Would love to hear your feedback, and I'm happy to collaborate if you're building something in this space!
– Prajeesh Chavan