Memory Leak: JavaScript heap out of memory with large conversation history
Environment
- claudecodeui version: 1.12.0
- Node.js version: v25.2.1
- OS: macOS 15.2 (Darwin 24.6.0)
- Memory configuration: --max-old-space-size=512 (default)
Problem Description
The server crashes with "JavaScript heap out of memory" when the ~/.claude/projects/ directory contains a large number of conversation files.
Error message:
FATAL ERROR: Ineffective mark-compacts near heap limit
Allocation failed - JavaScript heap out of memory
My data volume:
- 1,239 JSONL conversation files
- ~2 GB total conversation data
- Largest files: 31MB, 28MB, 20MB, 19MB, 15MB
The server crash-loops every ~40-60 seconds as launchd restarts it.
Root Cause Analysis
After investigation, I found several unbounded memory growth patterns in server/projects.js:
1. Unbounded allEntries accumulation (Lines 547-568)
const allEntries = [];
// ...
for (const { file } of filesWithStats) {
// ...
allEntries.push(...result.entries); // ⚠️ UNBOUNDED ACCUMULATION
}

This loads ALL entries from ALL JSONL files into memory without any upper bound.
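A minimal sketch of how the accumulation could be bounded, assuming a hypothetical MAX_TOTAL_ENTRIES constant and the same loop shape as above (the per-file parse call is shown only for illustration):

const MAX_TOTAL_ENTRIES = 5000; // illustrative cap, not an existing constant

const allEntries = [];
for (const { file } of filesWithStats) {
  const result = await parseJsonlSessions(file); // same per-file parse as above
  const remaining = MAX_TOTAL_ENTRIES - allEntries.length;
  if (remaining <= 0) break;                     // stop before exceeding the cap
  for (const entry of result.entries.slice(0, remaining)) {
    allEntries.push(entry);                      // avoids spreading huge arrays onto the stack
  }
}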
2. Full file parsing in parseJsonlSessions() (Line 668)
const entries = [];
// ... streams file but accumulates ALL entries
entries.push(entry);

Each file's entries are accumulated in an array. For large files (10k+ messages), this creates significant memory pressure.
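One way to relieve this, sketched below on the assumption that the parser already reads the file line by line with Node's readline (the function name and cap are illustrative, not the project's API): keep at most a fixed number of entries per file.

const fs = require('fs');
const readline = require('readline');

async function parseJsonlCapped(filePath, maxEntries = 1000) {
  const entries = [];
  const rl = readline.createInterface({
    input: fs.createReadStream(filePath),
    crlfDelay: Infinity,
  });
  for await (const line of rl) {
    if (!line.trim()) continue;
    try {
      entries.push(JSON.parse(line));
    } catch {
      // skip malformed lines
    }
    if (entries.length >= maxEntries) break; // stop reading once the cap is hit
  }
  rl.close();
  return entries;
}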
3. No cache eviction policy (Lines 197-203)
const projectDirectoryCache = new Map();
function clearProjectDirectoryCache() {
projectDirectoryCache.clear();
}

The cache grows unbounded - no LRU, no size limits, no TTL.
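A minimal LRU sketch built on Map's insertion-order guarantee; the class name and the 100-entry limit are illustrative, not part of the existing code:

class LruCache {
  constructor(maxSize = 100) {
    this.maxSize = maxSize;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);       // re-insert so the key becomes most recently used
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      // Map iterates in insertion order, so the first key is the least recently used
      this.map.delete(this.map.keys().next().value);
    }
  }
  clear() {
    this.map.clear();
  }
}

const projectDirectoryCache = new LruCache(100); // bounded replacement for the plain Map

The existing clearProjectDirectoryCache() could delegate to clear() unchanged; only reads and writes need to go through get/set so the oldest entry is evicted once the limit is exceeded.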
4. Full file load for deletion (Lines 909-934)
const content = await fs.readFile(jsonlFile, "utf8"); // Entire file in memory
const lines = content.split("\n").filter((line) => line.trim());
// Creates 3+ copies of file in memory

5. Session messages full load (Lines 819-861)
const messages = [];
// ... accumulates ALL messages for a session
messages.push(entry);

No streaming to the client, no chunking.
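A sketch of what lazy loading could look like: an async generator that yields matching entries one at a time instead of building a full messages array (the sessionId field and filtering logic are assumptions based on the description above):

const fs = require('fs');
const readline = require('readline');

async function* readSessionMessages(jsonlFile, sessionId) {
  const rl = readline.createInterface({
    input: fs.createReadStream(jsonlFile),
    crlfDelay: Infinity,
  });
  for await (const line of rl) {
    if (!line.trim()) continue;
    let entry;
    try {
      entry = JSON.parse(line);
    } catch {
      continue; // skip malformed lines
    }
    if (entry.sessionId === sessionId) yield entry; // one message at a time, never a full array
  }
}

A route handler could then for await over the generator and write each message to the response as it arrives.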
Workaround
Increase the Node.js heap size:
# In .env
NODE_OPTIONS=--max-old-space-size=2048
# Or in LaunchAgent plist
<key>NODE_OPTIONS</key>
<string>--max-old-space-size=2048</string>

This mitigates the crash but doesn't fix the underlying memory leak.
Suggested Fixes
- Implement streaming pagination - Process files in reverse chronological order, stop when enough sessions found
- Add max entry limits - Cap the allEntries array size with early termination
- Stream-based file rewriting - For deletion, use a temporary file + streaming copy instead of a full load (see the sketch after this list)
- Add cache size limits - Implement an LRU cache with max ~100 entries
- Lazy message loading - Return a message stream/iterator instead of a full array
- Memory monitoring - Add process.memoryUsage() logging to track growth
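For the stream-based rewriting idea, a rough sketch (assuming entries to delete are identified by a sessionId field; the function name and file layout are illustrative):

const fs = require('fs');
const fsp = require('fs/promises');
const readline = require('readline');

async function removeSessionFromFile(jsonlFile, sessionId) {
  const tmpFile = jsonlFile + '.tmp';
  const out = fs.createWriteStream(tmpFile);
  const rl = readline.createInterface({
    input: fs.createReadStream(jsonlFile),
    crlfDelay: Infinity,
  });
  for await (const line of rl) {
    if (!line.trim()) continue;
    try {
      if (JSON.parse(line).sessionId === sessionId) continue; // drop matching entries
    } catch {
      // keep malformed lines as-is
    }
    out.write(line + '\n');
  }
  await new Promise((resolve, reject) => {
    out.on('error', reject);
    out.end(resolve);
  });
  await fsp.rename(tmpFile, jsonlFile); // swap in the rewritten file
}

Backpressure handling is omitted for brevity; a production version would await 'drain' or use stream.pipeline. For memory monitoring, logging process.memoryUsage().heapUsed on an interval would be enough to make growth visible.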
Expected Behavior
The server should handle large conversation histories (1000+ sessions, 10GB+ data) without crashing by:
- Loading data lazily/on-demand
- Implementing pagination at the data layer
- Using bounded caches with eviction policies
Steps to Reproduce
- Use Claude Code CLI extensively for several months (accumulate 1000+ conversations)
- Run claudecodeui server with default 512MB heap
- Server crashes within 1-2 minutes of startup
Additional Context
The "Early Exit Optimization" at lines 564-567 is insufficient:
if (allSessions.size >= (limit + offset) * 2 && allEntries.length >= Math.min(3, filesWithStats.length)) {
break;
}

This only breaks after accumulating (limit + offset) * 2 sessions worth of data, which for high pagination offsets still loads massive amounts.
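A stricter condition could break as soon as the requested page can be served or a hard entry cap is hit; the sketch below reuses the hypothetical MAX_TOTAL_ENTRIES from the first fix and is illustrative only:

if (
  allSessions.size >= limit + offset ||   // enough sessions for the requested page
  allEntries.length >= MAX_TOTAL_ENTRIES  // hard cap on accumulated entries
) {
  break;
}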