A lightweight implementation of Kernel Memory as a Service
Updated Apr 24, 2025 - C#
Boost your terminal's efficiency with auto-completion powered by seamless ChatGPT integration.
Proxy server that automatically stores messages exchanged between any OAI-compatible frontend and backend as a ShareGPT dataset for training or fine-tuning.
Mistral-java-client is a Java client for the Mistral.ai API that makes it easy to interact with Mistral AI models. Currently supports all Mistral chat completion and embedding models.
Convert OpenAI chat completion request to markdown and vice versa 🔄
A sample application to demonstrate how to use Structured Outputs in OpenAI Chat Completions API with streaming, built using Next.js.
Natural Language Processing (NLP) - History-related Question-Answering System
Various examples of ChatGPT models inspired by the official OpenAI Cookbook repository.
A FastAPI proxy server that seamlessly exposes GitHub Copilot's chat completion capabilities as an OpenAI-compatible API service.
Use AI to create summaries of the ILO's labour statistics
Chat with LLM is a user-friendly Streamlit app that allows real-time interaction with various large language models (LLMs) via Hugging Face's API. Users can select different models, adjust settings such as temperature and token limits, and engage in dynamic conversations with AI.
An application that uses Azure OpenAI's API to generate text completions from a given prompt, processing and displaying the results in JSON format.
Official Echo AI Website
A Go API used to generate definitions of terms & phrases using the OpenAI chat completion API.
A Spring Boot library designed to abstract the interaction with the OpenAI Chat-Completion API.
Incorporate new LTA-related information (September 2021 and beyond) into an OpenAI GPT model.
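Several of the projects above convert between the OpenAI-style chat-completion message format and other representations (markdown, datasets, proxied APIs). A minimal illustrative sketch of that idea, assuming a hypothetical `messages_to_markdown` helper that is not taken from any of these repositories:

```python
# Illustrative sketch: render an OpenAI-style chat completion request as a
# markdown transcript, the kind of round-trip the converter projects above
# perform. The function name and output format are assumptions for this
# example, not any listed repository's actual API.

def messages_to_markdown(messages: list[dict]) -> str:
    """Render a list of {role, content} messages as a markdown transcript."""
    lines = []
    for msg in messages:
        lines.append(f"### {msg['role'].capitalize()}")  # role as a heading
        lines.append("")
        lines.append(msg["content"])
        lines.append("")
    return "\n".join(lines).rstrip() + "\n"

# A typical chat-completion request body, as accepted by OAI-compatible APIs.
request = {
    "model": "gpt-4o-mini",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Define 'chat completion'."},
    ],
}
print(messages_to_markdown(request["messages"]))
```

The same structure works in reverse: parsing `###`-delimited sections back into a `messages` list gives the round-trip conversion these tools provide.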