Problem Statement
Currently, Rockets does not support AI functionality. Many applications today rely on OpenAI’s API directly, making them dependent on a single AI provider.
There is no built-in way for Rockets to:
- Intercept AI requests and dynamically route them to different providers.
- Provide a unified AI interface that allows easy switching between AI backends.
- Extend AI capabilities to other Rockets modules, enabling AI-powered automation within the framework.
Describe the solution you'd like
Package Proposal
I propose adding a new AI module to Rockets that:
- Acts as a drop-in replacement for OpenAI’s API, so any application calling OpenAI can instead use Rockets without modifications.
- Implements a flexible AI service layer that supports multiple backends (e.g., OpenAI, Anthropic, local LLMs, custom AI models).
- Follows OpenAI’s API guidelines for endpoints while remaining configurable for other AI services.
- Can be integrated into other Rockets modules, allowing AI-enhanced functionality across the framework.
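The service layer described above could be sketched as follows. This is only an illustrative sketch: the `AiProvider`, `AiService`, and `EchoProvider` names are assumptions for this proposal, not existing Rockets APIs.

```typescript
// Message and completion shapes loosely modeled on OpenAI's chat API.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface ChatCompletion {
  id: string;
  model: string;
  content: string;
}

// Each backend (OpenAI, Anthropic, a local LLM, a custom model)
// would implement this one contract.
interface AiProvider {
  createChatCompletion(model: string, messages: ChatMessage[]): Promise<ChatCompletion>;
}

// The service layer delegates to whichever provider is configured,
// so switching backends is a configuration change, not a code change.
class AiService {
  constructor(private readonly provider: AiProvider) {}

  create(model: string, messages: ChatMessage[]): Promise<ChatCompletion> {
    return this.provider.createChatCompletion(model, messages);
  }
}

// A stub provider used here only to demonstrate the switching mechanism:
// it echoes the last user message back as the "completion".
class EchoProvider implements AiProvider {
  async createChatCompletion(model: string, messages: ChatMessage[]): Promise<ChatCompletion> {
    return { id: 'echo-1', model, content: messages[messages.length - 1].content };
  }
}
```

Swapping `EchoProvider` for an OpenAI- or Anthropic-backed implementation would require no changes to callers of `AiService`.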
Implementation Approach
API Endpoints Proposal
The AI module should expose the following OpenAI-compatible endpoints:
- `POST /v1/chat/completions` → Create a new AI-generated chat response.
- `GET /v1/chat/completions/{completion_id}` → Retrieve a specific response by its ID.
- `GET /v1/chat/completions` → List all generated chat responses.
- `DELETE /v1/chat/completions/{completion_id}` → Delete an AI-generated chat response by its ID.
The module should:
- Mirror OpenAI’s API structure to ensure compatibility.
- Allow dynamic configuration of AI providers, enabling easy switching between different models.
- Be designed for performance and scalability, ensuring low-latency AI responses.
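As a rough sketch of how the four endpoints could map onto a backing store, the snippet below pairs each route with a method on an in-memory store. `CompletionStore` and its methods are hypothetical names for this proposal; a real implementation would persist completions and delegate generation to the configured provider.

```typescript
interface StoredCompletion {
  id: string;
  model: string;
  content: string;
}

// In-memory stand-in for whatever persistence layer the module uses.
class CompletionStore {
  private completions = new Map<string, StoredCompletion>();
  private nextId = 0;

  // POST /v1/chat/completions
  create(model: string, content: string): StoredCompletion {
    const completion = { id: `cmpl-${++this.nextId}`, model, content };
    this.completions.set(completion.id, completion);
    return completion;
  }

  // GET /v1/chat/completions/{completion_id}
  retrieve(id: string): StoredCompletion | undefined {
    return this.completions.get(id);
  }

  // GET /v1/chat/completions
  list(): StoredCompletion[] {
    return [...this.completions.values()];
  }

  // DELETE /v1/chat/completions/{completion_id}
  delete(id: string): boolean {
    return this.completions.delete(id);
  }
}
```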
Describe alternatives you've considered
- Letting applications integrate OpenAI directly, which locks them into a single AI provider.
- Using third-party AI abstraction layers, which adds unnecessary external dependencies.
Additional context
- This feature would make Rockets a powerful AI middleware, enabling vendor-agnostic AI integration.
- Developers could switch AI models without changing their application code, reducing lock-in risks.
- Other Rockets modules could leverage this AI capability to enhance automation, decision-making, and user interactions.
- Once implemented, we should be able to generate an SDK following the OpenAI design, for example:
```typescript
import { RocketsAI } from 'rocketsai-sdk';

await RocketsAI.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello!" }]
});

await RocketsAI.chat.completions.retrieve("abc123");
await RocketsAI.chat.completions.list();
await RocketsAI.chat.completions.delete("abc123");
```