# polka

Local Ollama chatbot with Neo4j-backed memory to build a private, persistent personal knowledge base, all on your machine.
Start building your own knowledge base with locally run models.
## Architecture

```
┌───────────────────────────────┐
│  ai_terminal.py (CLI)         │
│  - REPL + meta commands       │
└────────────┬──────────────────┘
             │
    ┌────────┴───────┐     ┌────────────┐
    │     Ollama     │     │   Neo4j    │
    │  (Local LLM)   │     │ (Chat DB)  │
    └────────────────┘     └─────▲──────┘
                                 │
                        Messages & memory

Optional:
Neo4j ──► polka/storage.py ──► Markdown files (e.g., Obsidian vault)
```
## Features

- 🤖 Local Ollama model integration (no API keys needed)
- 🧠 Persistent chat memory with Neo4j graph database
- 📊 Automatic session summarization and topic classification
- 🔍 Vector embeddings for semantic search
- 📝 Optional Obsidian vault synchronization
- 🔒 Complete privacy - all data stays local
- ⚡ Fast macOS terminal interface for quick brainstorming
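To illustrate what semantic search over embedded messages boils down to, here is a minimal sketch with hypothetical, hand-written vectors; in the real setup the embeddings would come from a local model and be stored on Neo4j message nodes:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k(query_vec: list[float], messages, k: int = 1) -> list[str]:
    """Rank (text, embedding) pairs by similarity to the query vector."""
    scored = [(cosine_similarity(query_vec, emb), text) for text, emb in messages]
    return [text for _, text in sorted(scored, reverse=True)[:k]]

# Hypothetical pre-computed embeddings (real ones come from an embedding model)
messages = [
    ("note about Neo4j indexing", [0.9, 0.1, 0.0]),
    ("grocery list", [0.0, 0.2, 0.9]),
]
print(top_k([1.0, 0.0, 0.0], messages))  # → ['note about Neo4j indexing']
```

Neo4j can also do this ranking server-side with a vector index, which avoids pulling every embedding into Python.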
## Requirements
- Python 3.10+ (I develop with 3.13)
- Ollama: https://ollama.com
- Neo4j (Desktop recommended to visualize the knowledge graph): https://neo4j.com/download/
- uv for dependency management: https://github.com/astral-sh/uv
## Setup
```shell
# 1) Start Ollama and pull a model
ollama serve
ollama pull qwen2:0.5b   # or llama3.2:3b, deepseek-r1:7b

# 2) Install Python deps with uv
uv sync

# 3) Initialize Neo4j schema
# Open Neo4j Browser at http://localhost:7474 and run the contents of schema.cypher

# 4) Run the AI Terminal
uv run polka --model qwen2:0.5b
```

Tip (macOS/Homebrew):

```shell
brew install neo4j
neo4j start
```

## Configuration

- AGENTS.md: High-level agent behavior and guidelines that are injected into the system prompt at startup. Keep it concise; only the first ~3000 characters are used.
- USER.md (optional): Your personal context (bio, goals, preferences, notes). This file is .gitignored and not committed; it’s merged before AGENTS.md when building the system prompt so your personal info can tailor responses locally.
## Usage

Inside the terminal:

- `:help`: show meta commands
- `:model NAME`: switch Ollama model
- `:history`: show this run's conversation
- `:clear`: clear in-run memory
- `:q`: quit (writes session summary + topics)
## Obsidian sync

Sessions are saved to Neo4j. You can also write session markdown files to any directory. I save mine to an Obsidian vault.
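The Neo4j-to-markdown step could look roughly like this (hypothetical field names; the real logic is in polka/storage.py):

```python
from pathlib import Path

def session_to_markdown(session: dict) -> str:
    """Render one chat session as an Obsidian-friendly markdown note."""
    lines = [f"# Session {session['id']}", ""]
    for msg in session["messages"]:
        lines.append(f"**{msg['role']}**: {msg['text']}")
        lines.append("")
    return "\n".join(lines)

def write_session(session: dict, vault: Path) -> Path:
    """Write the rendered session into the vault directory."""
    out = vault / f"session-{session['id']}.md"
    out.write_text(session_to_markdown(session))
    return out
```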
```shell
# Choose a folder (Obsidian vault or any markdown directory)
export OBSIDIAN_VAULT_PATH="/path/to/your/markdown/folder"

# Sync all sessions
uv run python -m polka.storage

# Or a specific session
uv run python -m polka.storage <session_id>
```

## Links

- Ollama: https://ollama.com
- Neo4j Desktop: https://neo4j.com/download/
- uv: https://github.com/astral-sh/uv
- Obsidian: https://obsidian.md
Happy chatting! 🚀