# Hindsight API

**Memory System for AI Agents** — Temporal + Semantic + Entity Memory Architecture using PostgreSQL with pgvector.

Hindsight gives AI agents persistent memory that works like human memory: it stores facts, tracks entities and relationships, handles temporal reasoning ("what happened last spring?"), and forms opinions based on configurable disposition traits.
## Installation

```bash
pip install hindsight-api
```

## Quick Start

### Run the Server

```bash
# Set your LLM provider
export HINDSIGHT_API_LLM_PROVIDER=openai
export HINDSIGHT_API_LLM_API_KEY=sk-xxxxxxxxxxxx

# Start the server (uses embedded PostgreSQL by default)
hindsight-api
```

The server starts at http://localhost:8888 with:
- REST API for memory operations
- MCP server at `/mcp` for tool-use integration
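As a sketch of how a client might call the REST API from Python's standard library: the route and payload shape below are assumptions for illustration (check the API reference for the real endpoints), but the request-building pattern is generic:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8888"

def build_retain_request(memory_bank_id: str, content: str) -> urllib.request.Request:
    """Build a POST request that stores one memory.

    The /banks/{id}/retain route is a hypothetical example, not a documented path.
    """
    payload = json.dumps({"content": content}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/banks/{memory_bank_id}/retain",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_retain_request("my-assistant", "The user prefers Python")
print(req.get_method(), req.full_url)
# urllib.request.urlopen(req) would send it once the server is running
```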
| 29 | + |
| 30 | +### Use the Python API |
| 31 | + |
| 32 | +```python |
| 33 | +from hindsight_api import MemoryEngine |
| 34 | + |
| 35 | +# Create and initialize the memory engine |
| 36 | +memory = MemoryEngine() |
| 37 | +await memory.initialize() |
| 38 | + |
| 39 | +# Create a memory bank for your agent |
| 40 | +bank = await memory.create_memory_bank( |
| 41 | + name="my-assistant", |
| 42 | + background="A helpful coding assistant" |
| 43 | +) |
| 44 | + |
| 45 | +# Store a memory |
| 46 | +await memory.retain( |
| 47 | + memory_bank_id=bank.id, |
| 48 | + content="The user prefers Python for data science projects" |
| 49 | +) |
| 50 | + |
| 51 | +# Recall memories |
| 52 | +results = await memory.recall( |
| 53 | + memory_bank_id=bank.id, |
| 54 | + query="What programming language does the user prefer?" |
| 55 | +) |
| 56 | + |
| 57 | +# Reflect with reasoning |
| 58 | +response = await memory.reflect( |
| 59 | + memory_bank_id=bank.id, |
| 60 | + query="Should I recommend Python or R for this ML project?" |
| 61 | +) |
| 62 | +``` |
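The snippet above uses `await` at top level, which only works in async-aware REPLs; in a script, wrap the calls in an `async def main()` and drive it with `asyncio.run`. A minimal sketch of that pattern, using a stand-in engine so it runs without a server:

```python
import asyncio

class FakeEngine:
    """Stand-in for MemoryEngine so the pattern runs without a server.

    In real code, replace FakeEngine() with MemoryEngine() from hindsight_api.
    """
    async def initialize(self):
        pass

    async def retain(self, memory_bank_id, content):
        return {"memory_bank_id": memory_bank_id, "content": content}

async def main():
    memory = FakeEngine()
    await memory.initialize()
    return await memory.retain("my-assistant", "prefers Python")

result = asyncio.run(main())
print(result)
```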

## CLI Options

```bash
hindsight-api --help

# Common options
hindsight-api --port 9000       # Custom port (default: 8888)
hindsight-api --host 127.0.0.1  # Bind to localhost only
hindsight-api --workers 4       # Multiple worker processes
hindsight-api --log-level debug # Verbose logging
```

## Configuration

Configure via environment variables:

| Variable | Description | Default |
|----------|-------------|---------|
| `HINDSIGHT_API_DATABASE_URL` | PostgreSQL connection string | `pg0` (embedded) |
| `HINDSIGHT_API_LLM_PROVIDER` | `openai`, `groq`, `gemini`, `ollama` | `openai` |
| `HINDSIGHT_API_LLM_API_KEY` | API key for LLM provider | - |
| `HINDSIGHT_API_LLM_MODEL` | Model name | `gpt-4o-mini` |
| `HINDSIGHT_API_HOST` | Server bind address | `0.0.0.0` |
| `HINDSIGHT_API_PORT` | Server port | `8888` |

### Example with External PostgreSQL

```bash
export HINDSIGHT_API_DATABASE_URL=postgresql://user:pass@localhost:5432/hindsight
export HINDSIGHT_API_LLM_PROVIDER=groq
export HINDSIGHT_API_LLM_API_KEY=gsk_xxxxxxxxxxxx

hindsight-api
```

## Docker

```bash
docker run --rm -it -p 8888:8888 \
  -e HINDSIGHT_API_LLM_API_KEY=$OPENAI_API_KEY \
  -v $HOME/.hindsight-docker:/home/hindsight/.pg0 \
  ghcr.io/vectorize-io/hindsight:latest
```
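If you prefer Docker Compose, the same run command translates roughly to the following sketch, using only the image, port, environment variable, and volume mount from the command above:

```yaml
services:
  hindsight:
    image: ghcr.io/vectorize-io/hindsight:latest
    ports:
      - "8888:8888"
    environment:
      HINDSIGHT_API_LLM_API_KEY: ${OPENAI_API_KEY}
    volumes:
      # Persists the embedded PostgreSQL data between container restarts
      - ${HOME}/.hindsight-docker:/home/hindsight/.pg0
```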

## MCP Server

For local MCP integration without running the full API server:

```bash
hindsight-local-mcp
```

This runs a stdio-based MCP server that can be used directly with MCP-compatible clients.
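Many MCP clients are configured with a JSON block naming the command to launch. A hypothetical client configuration (the exact file location and schema depend on your client; the server name `hindsight` is an arbitrary label):

```json
{
  "mcpServers": {
    "hindsight": {
      "command": "hindsight-local-mcp"
    }
  }
}
```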

## Key Features

- **Multi-Strategy Retrieval (TEMPR)** — Semantic, keyword, graph, and temporal search combined with RRF fusion
- **Entity Graph** — Automatic entity extraction and relationship tracking
- **Temporal Reasoning** — Native support for time-based queries
- **Disposition Traits** — Configurable skepticism, literalism, and empathy that shape opinion formation
- **Three Memory Types** — World facts, bank actions, and formed opinions with confidence scores
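TEMPR merges the ranked result lists from its search strategies with Reciprocal Rank Fusion (RRF). A minimal sketch of plain RRF (the constant `k=60` is the conventional value from the original RRF paper, not necessarily what Hindsight uses internally):

```python
# Reciprocal Rank Fusion: merge several ranked lists into one.
def rrf_fuse(rankings, k=60):
    """Score each item by the sum of 1/(k + rank) over every list it appears in."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)

# e.g. a semantic and a keyword search returning different orderings:
fused = rrf_fuse([["a", "b", "c"], ["a", "d", "b"]])
print(fused)  # → ['a', 'b', 'd', 'c']
```

Items ranked highly by several strategies (like `"a"` above) float to the top without any score normalization across strategies, which is why RRF is a common choice for hybrid retrieval.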

## Documentation

Full documentation: [https://hindsight.vectorize.io](https://hindsight.vectorize.io)

- [Installation Guide](https://hindsight.vectorize.io/developer/installation)
- [Configuration Reference](https://hindsight.vectorize.io/developer/configuration)
- [API Reference](https://hindsight.vectorize.io/api-reference)
- [Python SDK](https://hindsight.vectorize.io/sdks/python)

## License

Apache 2.0