An all-in-one, AI-powered learning platform that adapts to you.
- Turn any topic into a full course in seconds. Upload your own documents to ground the AI with RAG, or let the model build from scratch.
- Create adaptive courses where the AI builds a dependency graph (DAG) and only unlocks new lessons once you’ve mastered the prerequisites.
- We don’t do boring flashcards. A LECTOR-based scheduler tracks concepts you struggle with and resurfaces them in future lessons exactly when you need them.
- Lessons are fully interactive. Like Claude Artifacts or the demos on Brilliant, you get hands-on widgets you can click and play with right inside the lesson.
- Executable code blocks are built in: run, tweak, and break code directly inside the lesson.
- Answer quick questions and self-assessment quizzes inside lessons; your responses continuously shape and adapt the course to you.
- Upload a PDF book or drop in a YouTube link and chat with it instantly.
- Run the whole stack 100% offline with Ollama, or plug in your own cloud API keys (supports many providers via LiteLLM).
- Persistent memory of your preferences and learning style, so it gets better at adapting to you the more you use it.
- Extensible via the Model Context Protocol (MCP), so you can connect your own tools and APIs to interpret data or take actions.
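The prerequisite-gated unlocking described above can be sketched in a few lines. This is an illustrative model only (the lesson names and the `unlocked()` helper are hypothetical, not Talimio's actual data model or API): each lesson maps to its prerequisite set, and a lesson unlocks once every prerequisite is mastered.

```python
def unlocked(prereqs: dict[str, set[str]], mastered: set[str]) -> set[str]:
    """Return lessons whose prerequisites are all mastered
    (excluding lessons already mastered themselves)."""
    return {
        lesson
        for lesson, deps in prereqs.items()
        if lesson not in mastered and deps <= mastered
    }

# A tiny course DAG: lesson -> its prerequisites.
course = {
    "variables": set(),
    "loops": {"variables"},
    "functions": {"variables"},
    "recursion": {"functions", "loops"},
}

print(sorted(unlocked(course, {"variables"})))  # → ['functions', 'loops']
```

Because the graph is a DAG, there is always at least one lesson with no unmastered prerequisites, so a learner can never get stuck.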
Prerequisites: Docker and Docker Compose installed.
Environment: All required .env values are defined directly in docker-compose.yml. You do not need backend/.env. To customize, edit the backend service environment block and optionally uncomment provider API keys.
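For reference, a customized backend environment block might look like the following. This is an illustrative excerpt, not the file shipped in the repo; the exact keys and defaults may differ, so check your copy of docker-compose.yml.

```yaml
# Illustrative excerpt — verify key names against the repo's docker-compose.yml.
services:
  backend:
    environment:
      PRIMARY_LLM_MODEL: "ollama/gpt-oss:20b"  # LiteLLM-style model id (assumed format)
      # OPENAI_API_KEY: "sk-..."               # uncomment to enable a cloud provider
```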
- Clone the repo
git clone https://github.com/SamDc73/Talimio.git
cd Talimio
- Start the stack
docker compose up -d
- Open the apps
- Frontend: http://localhost:5173
- API (FastAPI): http://localhost:8080
- Optional: Pull Ollama models (first run)
- Skip this step if you use cloud LLMs and have set provider keys in docker-compose.yml.
docker exec -it ollama ollama pull gpt-oss:20b
docker exec -it ollama ollama pull nomic-embed-text
Note: To disable the local LLM, comment out the ollama service in docker-compose.yml and set PRIMARY_LLM_MODEL to a cloud model. Provide the relevant API key(s) in the same environment block.
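Putting that note together, a cloud-only setup could look like this. The model id and API key variable are hypothetical examples (LiteLLM accepts many providers); substitute whichever provider and key you actually use, and verify against your docker-compose.yml.

```yaml
# Hypothetical cloud-only configuration — adapt names to your compose file.
services:
  # ollama:                # local LLM commented out
  #   image: ollama/ollama
  backend:
    environment:
      PRIMARY_LLM_MODEL: "gpt-4o"        # example cloud model id
      OPENAI_API_KEY: "sk-..."           # key for the matching provider
```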
- Stop/Update the stack
docker compose down
# update later
docker compose pull && docker compose up -d
- Clone the repo
git clone https://github.com/SamDc73/Talimio.git
cd Talimio
- Backend (Python 3.12+, uv):
cd backend
uv sync
cp .env.example .env
uv run uvicorn src.main:app --reload --port 8080
- Frontend (Node + pnpm):
- In a different tab/window:
cd web
cp .env.example .env
pnpm install
pnpm dev
- Now you can open the apps:
- Frontend: http://localhost:5173
- API (FastAPI): http://localhost:8080
Any type of contribution is greatly appreciated!
- Pick one tiny improvement
- Fix a typo, clarify a log/error message, tidy a function name, or improve copy.
- Good places:
README.md, backend/src/** (messages/docs), web/src/** (copy/UI nits).
- Run it locally
- Use Quick Start above, and test it out.
- Run checks before PR
- Backend lint:
cd backend && ruff check src --fix
- Backend types:
cd backend && uvx ty check src
- Frontend (the frontend lives in web/):
cd web && pnpm run lint
- Open a draft PR
- Two bullets are enough: why it helps, what changed.
Questions, help, or feedback? Join our Discord
