RikkaHub is an Android app that supports multiple LLM providers.
Dive is an open-source MCP Host desktop application that seamlessly integrates with any LLM that supports function calling. ✨
Kangaroo is an AI-powered SQL client and admin tool for popular databases (MariaDB / MySQL / Oracle / PostgreSQL / Redis / SQLite / SQL Server / ...) on Windows / macOS / Linux. It supports table design, querying, modeling, sync, and export/import, with a focus on being comfortable, fun, and developer-friendly.
Minimalistic UI for Ollama LMs - this powerful React interface for LLMs drastically improves the chatbot experience and works offline.
LLMX: the easiest third-party local LLM UI for the web!
A single-file tkinter-based Ollama GUI project with no external dependencies.
A UI client for Ollama written in Compose Multiplatform
Odin Runes, a Java-based GPT client, lets you interact with your preferred GPT model right from your favorite text editor. There is more: it also facilitates prompt engineering by extracting context from diverse sources using technologies such as OCR, enhancing overall productivity and saving costs.
Simpler than simple: run LLMs on your computer through Ollama with no GPU required.
OllamaFX is a native, lightweight, and professional JavaFX desktop client for Ollama. Run Llama 3, Mistral, and Phi-3 locally with maximum privacy, low RAM overhead, and a powerful interface designed for power users and developers.
A simple HTML Ollama chatbot that is easy to install: just copy the HTML file to your computer and open it.
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support coming in the next version.
BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
Ollama Client – chat with local LLMs inside your browser. A lightweight, privacy-first Chrome extension to chat with local LLMs via Ollama, LM Studio, and llama.cpp. Supports streaming, stop/regenerate, RAG, and easy model switching — all without cloud APIs or data leaks.
Cortex is a private, secure, and highly responsive desktop AI assistant designed for seamless interaction with local Large Language Models (LLMs) through the Ollama framework. All models and data stay on your device, no cloud, no third parties.
🚀 A powerful Flutter-based AI chat application that lets you run LLMs directly on your mobile device or connect to local model servers. Features offline model execution, Ollama/LLMStudio integration, and a beautiful modern UI. Privacy-focused, cross-platform, and fully open source.
A desktop GUI for Ollama.
PuPu is a lightweight, cross-platform desktop AI client that works with both local and cloud-hosted models. Whether you prefer running models on your own machine or connecting to providers like OpenAI and Anthropic, PuPu gives you a unified, elegant interface — your AI, your rules.
An Ollama client for Android.
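Most of the clients listed above talk to Ollama's local HTTP API, which listens on port 11434 by default. As a minimal sketch of what such a client does under the hood (assuming Ollama is running locally and the `llama3` model has been pulled; both are assumptions, not details from any project above):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def build_chat_payload(model, messages, stream=False):
    """Construct the JSON body expected by Ollama's /api/chat endpoint."""
    return {"model": model, "messages": messages, "stream": stream}

def chat(model, prompt):
    """Send a single-turn chat request and return the assistant's reply text."""
    payload = build_chat_payload(model, [{"role": "user", "content": prompt}])
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

if __name__ == "__main__":
    print(chat("llama3", "Say hello in one sentence."))
```

The GUI clients above add streaming, model switching, and chat history on top of this same request/response loop.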