
Commit 8805614

Merge pull request #7 from redis-developer/extract-memories
Extract memories, better summarization, etc.
2 parents: 837387a + 159f7d4 · commit 8805614

29 files changed: +4163 -963 lines

.gitignore

Lines changed: 1 addition & 0 deletions

@@ -221,3 +221,4 @@ libs/redis/docs/.Trash*
 .python-version
 .idea/*
 .vscode/settings.json
+.cursor

Dockerfile

Lines changed: 1 addition & 1 deletion

@@ -28,4 +28,4 @@ ENTRYPOINT []


 # Run the API server
-CMD ["python", "-m", "agent_memory_server.main"]
+CMD ["uv", "run", "agent-memory", "api"]

README.md

Lines changed: 57 additions & 63 deletions

@@ -197,6 +197,18 @@ Example:
 agent-memory task-worker --concurrency 5 --redelivery-timeout 60
 ```

+#### `rebuild_index`
+Rebuilds the search index for Redis Memory Server.
+```bash
+agent-memory rebuild_index
+```
+
+#### `migrate_memories`
+Runs data migrations. Migrations are reentrant.
+```bash
+agent-memory migrate_memories
+```
+
 To see help for any command, you can use `--help`:
 ```bash
 agent-memory --help
@@ -207,25 +219,39 @@ agent-memory mcp --help

 ## Getting Started

-### Local Install
+### Installation

 First, you'll need to download this repository. After you've downloaded it, you can install and run the servers.

-1. Install the package and required dependencies with pip, ideally into a virtual environment:
-```bash
-pip install -e .
-```
-
-**NOTE:** This project uses `uv` for dependency management, so if you have uv installed, you can run `uv sync` instead of `pip install ...` to install the project's dependencies.
+This project uses [uv](https://github.com/astral-sh/uv) for dependency management.

-2 (a). The easiest way to start the REST API server and MCP server in SSE mode is to use Docker Compose. See the Docker Compose section of this file for more details.
+1. Install uv:
+```bash
+pip install uv
+```

-2 (b). You can also run the REST API and MCP servers directly, e.g.:
-#### REST API (direct, without CLI)
+2. Install the package and required dependencies:
 ```bash
-python -m agent_memory_server.main
+uv sync
 ```

+2. Set up environment variables (see Configuration section)
+
+### Running
+
+The easiest way to start the REST and MCP servers is to use Docker Compose. See the Docker Compose section of this file for more details.
+
+But you can also run these servers via the CLI commands. Here's how you
+run the REST API server:
+```bash
+uv run agent-memory api
+```
+
+And the MCP server:
+```
+uv run agent-memory mcp --mode <stdio|sse>
+```
+
 **NOTE:** With uv, prefix the command with `uv`, e.g.: `uv run agent-memory --mode sse`. If you installed from source, you'll probably need to add `--directory` to tell uv where to find the code: `uv run --directory <path/to/checkout> run agent-memory --mode stdio`.

 ### Docker Compose
@@ -293,52 +319,12 @@ Cursor's MCP config is similar to Claude's, but it also supports SSE servers, so

 ## Configuration

-You can configure the service using environment variables:
-
-| Variable | Description | Default |
-|----------|-------------|---------|
-| `REDIS_URL` | URL for Redis connection | `redis://localhost:6379` |
-| `LONG_TERM_MEMORY` | Enable/disable long-term memory | `True` |
-| `WINDOW_SIZE` | Maximum messages in short-term memory | `20` |
-| `OPENAI_API_KEY` | API key for OpenAI | - |
-| `ANTHROPIC_API_KEY` | API key for Anthropic | - |
-| `GENERATION_MODEL` | Model for text generation | `gpt-4o-mini` |
-| `EMBEDDING_MODEL` | Model for text embeddings | `text-embedding-3-small` |
-| `PORT` | REST API server port | `8000` |
-| `TOPIC_MODEL` | BERTopic model for topic extraction | `MaartenGr/BERTopic_Wikipedia` |
-| `NER_MODEL` | BERT model for NER | `dbmdz/bert-large-cased-finetuned-conll03-english` |
-| `ENABLE_TOPIC_EXTRACTION` | Enable/disable topic extraction | `True` |
-| `ENABLE_NER` | Enable/disable named entity recognition | `True` |
-| `MCP_PORT` | MCP server port |9000|
-
+You can configure the MCP and REST servers and task worker using environment
+variables. See the file `config.py` for all the available settings.

-## Development
-
-### Installation
-
-This project uses [uv](https://github.com/astral-sh/uv) for dependency management.
-
-1. Install dependencies:
-```bash
-uv sync --all-extras
-```
-
-2. Set up environment variables (see Configuration section)
-
-3. Run the API server:
-```bash
-agent-memory api
-```
-
-4. In a separate terminal, run the MCP server (use either the "stdio" or "sse" options to set the running mode) if you want to test with tools like Cursor or Claude:
-```bash
-agent-memory mcp --mode <stdio|sse>
-```
-
-### Running Tests
-```bash
-python -m pytest
-```
+The names of the settings map directly to an environment variable, so for
+example, you can set the `openai_api_key` setting with the `OPENAI_API_KEY`
+environment variable.

 ## Running the Background Task Worker

@@ -382,16 +368,24 @@ agent-memory schedule-task "agent_memory_server.long_term_memory.compact_long_te
 - **Semantic Deduplication**: Finds and merges memories with similar meaning using vector search
 - **LLM-powered Merging**: Uses language models to intelligently combine memories

+## Running Migrations
+When the data model changes, we add a migration in `migrations.py`. You can run
+these to make sure your schema is up to date, like so:
+
+```bash
+uv run agent-memory migrate-memories
+```
+
+## Development
+
+### Running Tests
+```bash
+uv run pytest
+```
+
 ## Contributing
 1. Fork the repository
 2. Create a feature branch
 3. Commit your changes
 4. Push to the branch
 5. Create a Pull Request
-
-### Running Tests
-
-```bash
-# Run all tests
-pytest tests
-```
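
A note on the configuration change above: the env-var mapping works because the project's `Settings` class is a pydantic `BaseSettings` (see the `config.py` diff below), whose field names resolve case-insensitively from environment variables. A minimal sketch of that mapping, assuming the `pydantic-settings` package and illustrative field values, not the project's actual `config.py`:

```python
# Illustrative sketch only -- not the project's real config.py.
# Assumes pydantic-settings: field names map case-insensitively to env vars.
import os

from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    redis_url: str = "redis://localhost:6379"
    openai_api_key: str | None = None  # filled from OPENAI_API_KEY


os.environ["OPENAI_API_KEY"] = "sk-example"  # placeholder, not a real key
settings = Settings()
print(settings.openai_api_key)  # -> "sk-example"
```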

agent_memory_server/api.py

Lines changed: 12 additions & 8 deletions

@@ -223,12 +223,16 @@ async def search_long_term_memory(payload: SearchPayload):
     # Extract filter objects from the payload
     filters = payload.get_filters()

-    # Pass text, redis, and filter objects to the search function
-    return await long_term_memory.search_long_term_memories(
-        redis=redis,
-        text=payload.text,
-        distance_threshold=payload.distance_threshold,
-        limit=payload.limit,
-        offset=payload.offset,
+    kwargs = {
+        "redis": redis,
+        "distance_threshold": payload.distance_threshold,
+        "limit": payload.limit,
+        "offset": payload.offset,
         **filters,
-    )
+    }
+
+    if payload.text:
+        kwargs["text"] = payload.text
+
+    # Pass text, redis, and filter objects to the search function
+    return await long_term_memory.search_long_term_memories(**kwargs)
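
The rewrite above only forwards `text` when the payload actually contains a query, so an absent query falls through to whatever default `search_long_term_memories` defines rather than an explicit empty value. A small, self-contained sketch of that conditional-kwargs pattern (hypothetical function and filter names, not the server's real code):

```python
# Sketch of the conditional-kwargs pattern used in api.py above.
# `fake_search` stands in for long_term_memory.search_long_term_memories.
import asyncio
from typing import Any


async def fake_search(text: str | None = None, limit: int = 10, **filters: Any) -> dict:
    # When `text` is omitted, the callee's own default (None) applies.
    return {"text": text, "limit": limit, "filters": filters}


async def handler(payload_text: str, limit: int) -> dict:
    kwargs: dict[str, Any] = {"limit": limit, "namespace": "demo"}
    if payload_text:  # only pass `text` when the client supplied a query
        kwargs["text"] = payload_text
    return await fake_search(**kwargs)


print(asyncio.run(handler("", limit=5)))       # text stays at the callee's default
print(asyncio.run(handler("redis", limit=5)))  # text is forwarded explicitly
```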

agent_memory_server/cli.py

Lines changed: 41 additions & 2 deletions

@@ -13,6 +13,11 @@

 from agent_memory_server.config import settings
 from agent_memory_server.logging import configure_logging, get_logger
+from agent_memory_server.migrations import (
+    migrate_add_discrete_memory_extracted_2,
+    migrate_add_memory_hashes_1,
+    migrate_add_memory_type_3,
+)
 from agent_memory_server.utils.redis import ensure_search_index_exists, get_redis_conn


@@ -34,17 +39,51 @@ def version():
     click.echo(f"agent-memory-server version {VERSION}")


+@cli.command()
+def rebuild_index():
+    """Rebuild the search index."""
+    import asyncio
+
+    async def setup_and_run():
+        redis = await get_redis_conn()
+        await ensure_search_index_exists(redis, overwrite=True)
+
+    asyncio.run(setup_and_run())
+
+
+@cli.command()
+def migrate_memories():
+    """Migrate memories from the old format to the new format."""
+    import asyncio
+
+    click.echo("Starting memory migrations...")
+
+    async def run_migrations():
+        redis = await get_redis_conn()
+        migrations = [
+            migrate_add_memory_hashes_1,
+            migrate_add_discrete_memory_extracted_2,
+            migrate_add_memory_type_3,
+        ]
+        for migration in migrations:
+            await migration(redis=redis)
+
+    asyncio.run(run_migrations())
+
+    click.echo("Memory migrations completed successfully.")
+
+
 @cli.command()
 @click.option("--port", default=settings.port, help="Port to run the server on")
 @click.option("--host", default="0.0.0.0", help="Host to run the server on")
 @click.option("--reload", is_flag=True, help="Enable auto-reload")
 def api(port: int, host: str, reload: bool):
     """Run the REST API server."""
-    from agent_memory_server.main import app, on_start_logger
+    from agent_memory_server.main import on_start_logger

     on_start_logger(port)
     uvicorn.run(
-        app,
+        "agent_memory_server.main:app",
         host=host,
         port=port,
         reload=reload,
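
One detail worth calling out in the `api` command above: `uvicorn.run` now receives the import string `"agent_memory_server.main:app"` instead of the imported `app` object. Uvicorn only honors its `reload` option when the application is passed as an import string, so this is what makes the `--reload` flag effective. A minimal sketch of the same pattern, using a hypothetical `myservice.main:app` module path:

```python
# Sketch: pass an import string so uvicorn can re-import the app on reload.
# "myservice.main:app" is a hypothetical module path, not this project's.
import uvicorn

if __name__ == "__main__":
    uvicorn.run(
        "myservice.main:app",  # import string, not the app object
        host="0.0.0.0",
        port=8000,
        reload=True,  # reload only works with an import string
    )
```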

agent_memory_server/client/api.py

Lines changed: 5 additions & 0 deletions

@@ -13,6 +13,7 @@
     CreatedAt,
     Entities,
     LastAccessed,
+    MemoryType,
     Namespace,
     SessionId,
     Topics,
@@ -273,6 +274,7 @@ async def search_long_term_memory(
         last_accessed: LastAccessed | dict[str, Any] | None = None,
         user_id: UserId | dict[str, Any] | None = None,
         distance_threshold: float | None = None,
+        memory_type: MemoryType | dict[str, Any] | None = None,
         limit: int = 10,
         offset: int = 0,
     ) -> LongTermMemoryResults:
@@ -313,6 +315,8 @@ async def search_long_term_memory(
             last_accessed = LastAccessed(**last_accessed)
         if isinstance(user_id, dict):
             user_id = UserId(**user_id)
+        if isinstance(memory_type, dict):
+            memory_type = MemoryType(**memory_type)

         # Apply default namespace if needed and no namespace filter specified
         if namespace is None and self.config.default_namespace is not None:
@@ -328,6 +332,7 @@ async def search_long_term_memory(
             last_accessed=last_accessed,
             user_id=user_id,
             distance_threshold=distance_threshold,
+            memory_type=memory_type,
             limit=limit,
             offset=offset,
         )
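
As the diff shows, the client accepts the new `memory_type` filter either as a `MemoryType` object or as a plain dict, coercing the dict with `MemoryType(**memory_type)` before forwarding it. A rough sketch of that object-or-dict pattern, using a hypothetical pydantic filter model (the real `MemoryType` filter's fields are not shown in this diff):

```python
# Illustrative only: a hypothetical filter model and coercion helper that
# mirror the isinstance() normalization in client/api.py above.
from typing import Any

from pydantic import BaseModel


class MemoryTypeFilter(BaseModel):
    eq: str | None = None  # hypothetical field; the real filter may differ


def normalize(memory_type: MemoryTypeFilter | dict[str, Any] | None) -> MemoryTypeFilter | None:
    if isinstance(memory_type, dict):
        return MemoryTypeFilter(**memory_type)
    return memory_type


print(normalize({"eq": "semantic"}))               # coerced from a dict
print(normalize(MemoryTypeFilter(eq="episodic")))  # passed through unchanged
```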

agent_memory_server/config.py

Lines changed: 3 additions & 1 deletion

@@ -20,10 +20,12 @@ class Settings(BaseSettings):
     mcp_port: int = 9000

     # Topic and NER model settings
-    topic_model: str = "MaartenGr/BERTopic_Wikipedia"
+    topic_model_source: Literal["NER", "LLM"] = "LLM"
+    topic_model: str = "MaartenGr/BERTopic_Wikipedia"  # LLM model here if using LLM
     ner_model: str = "dbmdz/bert-large-cased-finetuned-conll03-english"
     enable_topic_extraction: bool = True
     enable_ner: bool = True
+    top_k_topics: int = 3

     # RedisVL Settings
     redisvl_distance_metric: str = "COSINE"
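
With `topic_model_source` added, `topic_model` now names either a BERTopic model or an LLM depending on the selected source, and `top_k_topics` caps how many topics are kept. Purely as an illustration of how code consuming these settings might branch on that switch (hypothetical, not the project's extraction module):

```python
# Hypothetical dispatcher illustrating the intent of topic_model_source;
# the project's real extraction logic may differ.
from typing import Literal


def choose_topic_backend(
    topic_model_source: Literal["NER", "LLM"],
    topic_model: str,
    top_k_topics: int,
) -> str:
    if topic_model_source == "LLM":
        # In this mode, topic_model holds an LLM name (per the config comment).
        return f"LLM topics via {topic_model}, keeping top {top_k_topics}"
    # Otherwise fall back to the BERTopic-style model path.
    return f"BERTopic topics via {topic_model}, keeping top {top_k_topics}"


print(choose_topic_backend("LLM", "gpt-4o-mini", 3))
print(choose_topic_backend("NER", "MaartenGr/BERTopic_Wikipedia", 3))
```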

agent_memory_server/docket_tasks.py

Lines changed: 2 additions & 0 deletions

@@ -7,6 +7,7 @@
 from docket import Docket

 from agent_memory_server.config import settings
+from agent_memory_server.extraction import extract_discrete_memories
 from agent_memory_server.long_term_memory import (
     compact_long_term_memories,
     extract_memory_structure,
@@ -24,6 +25,7 @@
     summarize_session,
     index_long_term_memories,
     compact_long_term_memories,
+    extract_discrete_memories,
 ]
