A Rust-powered client library for interacting with large language models (LLMs). Supports streaming and non-streaming conversations with various API providers.
This is a work in progress and is not yet ready for production use. Some providers may not be fully supported yet.
See /src/models/models.rs for the supported models and /src/api/providers.rs for the supported providers.
- 🚀 Async-first implementation using Tokio runtime
- 🌐 Multi-provider support (Deepseek, etc.)
- 📡 Stream response handling with backpressure support
- 🔧 Configurable API endpoints and rate limiting
- 🛠️ Strong type safety with Rust enums and structs
- 🧠 Conversation memory management
- 🚦 Comprehensive error handling
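As an illustrative sketch of the enum-based type safety and error handling listed above (the names `Provider`, `LlmError`, and the endpoint URL here are hypothetical, not the crate's actual API; see /src/api/providers.rs for the real definitions):

```rust
// Hypothetical provider enum: adding a variant forces every `match` to handle it.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Provider {
    Deepseek,
    // Further providers would be added as new variants.
}

impl Provider {
    /// Base endpoint per provider (URL shown for illustration only).
    fn endpoint(&self) -> &'static str {
        match self {
            Provider::Deepseek => "https://api.deepseek.com/v1",
        }
    }
}

/// Hypothetical error enum in the spirit of "comprehensive error handling".
#[derive(Debug)]
enum LlmError {
    RateLimited { retry_after_secs: u64 },
    Http(u16),
}

/// Map an HTTP status code to a typed result.
fn classify(status: u16) -> Result<(), LlmError> {
    match status {
        200 => Ok(()),
        429 => Err(LlmError::RateLimited { retry_after_secs: 30 }),
        code => Err(LlmError::Http(code)),
    }
}

fn main() {
    let p = Provider::Deepseek;
    println!("{:?} -> {}", p, p.endpoint());
    assert!(matches!(classify(429), Err(LlmError::RateLimited { .. })));
}
```

Modeling providers and errors as enums lets the compiler reject unhandled cases at build time instead of surfacing them at runtime.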
Add to your `Cargo.toml`:

```toml
[dependencies]
llmhub = { git = "https://github.com/akirco/llmhub" }
```
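Since the crate is fetched from git, you may want to pin a specific revision for reproducible builds (the `rev` value below is a placeholder for a real commit SHA):

```toml
[dependencies]
llmhub = { git = "https://github.com/akirco/llmhub", rev = "<commit-sha>" }
```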
Check out the examples, then run:

```shell
cargo run --example llmhub_test
```
Feel free to open issues or pull requests.
MIT License