This repository was archived by the owner on Jul 17, 2025. It is now read-only.
Chat with LLMs. Bring your own key. Keyboard-centric, mobile friendly, searchable.

svelte chat ui logo

Svelte Chat UI

> [!NOTE]
> This project is no longer maintained.

Raycast recently announced that you can bring your own API key. Raycast offers a much more polished and actively maintained interface for chatting with LLMs. This change fully addresses the original problem that this project was designed to solve.

I created this project to solve my own need: a fast, keyboard-centric way to chat with LLMs using my own API key. For many months, it was my primary AI chat app and it was a joy to build something that solved my own problem and was useful to others.

For that reason, I've decided to archive the project. The code will remain available on GitHub and the web app will stay online, but I will no longer be working on it.

Thank you to everyone who used the app, filed issues, and provided feedback. If you're interested in taking over the project, please feel free to reach out.

If you want more context: https://notes.iansinnott.com/blog/posts/Sunsetting+Prompta+-+My+LLM+Chat+App


Yet another interface for chatting with LLMs via API.

Website | Downloads | Launch App

Screenshots: mobile view · full-text search · keyboard-centric UI · comments

Features

  • Full-text search across all previous conversations and chat threads
  • Sync your chat history across devices
  • Keyboard-centric
  • Leave notes on responses, such as "working code!" or "not working"
  • Keep all your chat history stored locally
  • Chat with the latest models (updated dynamically)
  • Use local LLMs like Llama, Mistral, etc.
  • Customize the system message

How to use

Running on macOS

macOS users will need to right-click the app and select "Open" the first time they run it, because the app is signed but not notarized.

Screenshots: right-click the app and choose "Open", then click "Open" in the dialog.

Developing

bun is used for development. You can try yarn, npm, etc., but other package managers have not been tested and are not deliberately supported:

bun install
bun run dev

# To develop the Tauri desktop app as well:
bun run dev:tauri

Building

To create a production version of your app:

bun run build

If you want to build only for the browser, ignoring the desktop app:

bun run ui:build-static

The advantage is that you don't need the Rust toolchain, which is only required for building the Tauri desktop app.

Releasing a new Version

bun run release

You will be prompted to enter a new version number. New versions that don't contain a suffix such as -beta or -alpha will be published to GitHub.
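The suffix rule above amounts to treating any version with a prerelease tag as unpublishable. A minimal sketch of that check (a hypothetical helper, not the project's actual release script):

```javascript
// Hypothetical helper: decide whether a version string should be
// published to GitHub, per the rule above. Versions carrying a
// prerelease suffix such as "-beta" or "-alpha" are skipped.
function shouldPublish(version) {
  return !/-(?:alpha|beta)/.test(version);
}

console.log(shouldPublish("1.2.0"));        // → true
console.log(shouldPublish("1.2.0-beta.1")); // → false
```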

Built With

  • SQLite via vlcn/cr-sqlite - SQLite compiled to WASM running in the browser using CRDTs for conflict-free replication.
  • Tauri - A Rust-based alternative to Electron (Only used in desktop builds)
  • Svelte - Reactive UI framework
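The CRDT-based replication that cr-sqlite provides can be illustrated with a toy last-write-wins register: each write carries a logical timestamp, and merging two replicas deterministically keeps the newer value, so replicas converge without coordination. This is illustrative only and far simpler than cr-sqlite's actual CRDTs:

```javascript
// Toy last-write-wins (LWW) register merge. Each replica's state is
// { value, ts }, where ts is a logical timestamp. Merging keeps the
// entry with the higher timestamp, so both replicas converge to the
// same value regardless of merge order.
function mergeLww(a, b) {
  return a.ts >= b.ts ? a : b;
}

const replicaA = { value: "hello", ts: 2 };
const replicaB = { value: "hola", ts: 3 };

console.log(mergeLww(replicaA, replicaB).value); // → "hola"
console.log(mergeLww(replicaB, replicaA).value); // → "hola" (order-independent)
```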
