Polka: Personal Ollama Knowledge Acquisition

Local Ollama chatbot with Neo4j-backed memory to build a private, persistent personal knowledge base — all on your machine.

Start building your own knowledge base with locally run models.

┌───────────────────────────────┐
│  ai_terminal.py (CLI)         │
│  - REPL + meta commands       │
└────────────┬──────────────────┘
             │
    ┌────────┴───────┐     ┌────────────┐
    │    Ollama      │     │   Neo4j    │
    │  (Local LLM)   │     │ (Chat DB)  │
    └────────────────┘     └─────▲──────┘
                                 │
                         Messages & memory

Optional:
  Neo4j ──► polka/storage.py ──► Markdown files (e.g., Obsidian vault)
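For orientation, here is a minimal sketch of the chat loop in the diagram, using the ollama and neo4j Python packages. This is not Polka's actual code: the model name, credentials, and the Message node shape are assumptions.

import ollama                      # official Ollama Python client
from neo4j import GraphDatabase    # official Neo4j Python driver

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
history = []

while True:
    user_text = input("you> ")
    if user_text == ":q":          # meta command: quit
        break
    history.append({"role": "user", "content": user_text})
    reply = ollama.chat(model="qwen2:0.5b", messages=history)
    answer = reply["message"]["content"]
    history.append({"role": "assistant", "content": answer})
    print(answer)
    # Persist both turns so later sessions can recall them (node shape is illustrative)
    driver.execute_query(
        "CREATE (:Message {role: 'user', text: $u}), (:Message {role: 'assistant', text: $a})",
        u=user_text, a=answer,
    )

driver.close()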

Features

  • 🤖 Local Ollama model integration (no API keys needed)
  • 🧠 Persistent chat memory with Neo4j graph database
  • 📊 Automatic session summarization and topic classification
  • 🔍 Vector embeddings for semantic search (see the sketch after this list)
  • 📝 Optional Obsidian vault synchronization
  • 🔒 Complete privacy - all data stays local
  • ⚡ Fast macOS terminal interface for quick brainstorming
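
The semantic-search feature combines local Ollama embeddings with a Neo4j vector index. A sketch of what a lookup could look like, assuming an embedding model such as nomic-embed-text has been pulled and a vector index named message_embedding exists over a text property (the index name, label, and properties are assumptions, not Polka's actual schema):

import ollama
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Embed the query text locally, then ask Neo4j for the 5 nearest stored messages
query_vec = ollama.embeddings(model="nomic-embed-text", prompt="ideas about graph schemas")["embedding"]
records, _, _ = driver.execute_query(
    """
    CALL db.index.vector.queryNodes('message_embedding', 5, $vec)
    YIELD node, score
    RETURN node.text AS text, score
    """,
    vec=query_vec,
)
for rec in records:
    print(f"{rec['score']:.3f}  {rec['text']}")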

Quick Start

Requirements

  • Ollama installed and running locally
  • A local Neo4j instance (Browser at http://localhost:7474)
  • Python with uv for dependency management

Setup

# 1) Start Ollama and pull a model
ollama serve
ollama pull qwen2:0.5b    # or llama3.2:3b, deepseek-r1:7b

# 2) Install Python deps with uv
uv sync

# 3) Initialize Neo4j schema
# Open Neo4j Browser at http://localhost:7474 and run the contents of schema.cypher

# 4) Run the AI Terminal
uv run polka --model qwen2:0.5b

Tip (macOS/Homebrew):

brew install neo4j
neo4j start
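
If you'd rather load the schema from code than paste it into the Neo4j Browser (step 3 above), a minimal sketch with the official Python driver; the credentials are placeholders, and each statement is run separately because the driver rejects multi-statement strings:

from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
with open("schema.cypher") as f:
    # Split the file on ";" and drop empty fragments
    statements = [s.strip() for s in f.read().split(";") if s.strip()]
for stmt in statements:
    driver.execute_query(stmt)
driver.close()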

Context

  • AGENTS.md: High‑level agent behavior and guidelines that are injected into the system prompt at startup. Keep it concise; only the first ~3000 characters are used.
  • USER.md (optional): Your personal context (bio, goals, preferences, notes). This file is .gitignored and not committed; it’s merged before AGENTS.md when building the system prompt so your personal info can tailor responses locally.
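
As a rough illustration of that merge order, a sketch of how the system prompt could be assembled; this is an assumption about the mechanics, not Polka's actual code:

from pathlib import Path

parts = []
user_md = Path("USER.md")
if user_md.exists():                                  # optional, .gitignored personal context
    parts.append(user_md.read_text())
parts.append(Path("AGENTS.md").read_text()[:3000])    # only the first ~3000 characters are used

messages = [{"role": "system", "content": "\n\n".join(parts)}]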

Usage

Inside the terminal:

  • :help: show meta commands
  • :model NAME: switch Ollama model
  • :history: show this run's conversation
  • :clear: clear in-run memory
  • :q: quit (writes session summary + topics)

Markdown Archive (Obsidian or any folder)

Sessions are saved to Neo4j. You can also write session markdown files to any directory. I save mine to an Obsidian vault.

# Choose a folder (Obsidian vault or any markdown directory)
export OBSIDIAN_VAULT_PATH="/path/to/your/markdown/folder"

# Sync all sessions
uv run python -m polka.storage

# Or a specific session
uv run python -m polka.storage <session_id>
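
Under the hood this is a Neo4j-to-Markdown export. A sketch of the general idea (the labels, properties, and relationship below are assumptions about the stored shape, not Polka's actual schema):

import os
from pathlib import Path
from neo4j import GraphDatabase

vault = Path(os.environ["OBSIDIAN_VAULT_PATH"])
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Pull one session's messages in order and write them out as a Markdown note
records, _, _ = driver.execute_query(
    """
    MATCH (s:Session {id: $sid})-[:HAS_MESSAGE]->(m:Message)
    RETURN m.role AS role, m.text AS text
    ORDER BY m.timestamp
    """,
    sid="example-session-id",
)
lines = [f"**{r['role']}**: {r['text']}" for r in records]
(vault / "example-session-id.md").write_text("\n\n".join(lines))
driver.close()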

Links

  • Ollama: https://ollama.com
  • Neo4j: https://neo4j.com
  • uv: https://docs.astral.sh/uv
  • Obsidian: https://obsidian.md

Happy chatting! 🚀
