59 changes: 20 additions & 39 deletions src/common/createPreloader.ts
@@ -5,52 +5,33 @@ export function createPreloader<T, U extends unknown[]>(
   fun: (...args: U) => Promise<T>
 ) {
   let timeout: undefined | NodeJS.Timeout;
-  let waiting: ((value: T | PromiseLike<T>) => void)[] = [];
-  let argsStr: string | null = null;
-  let data: T | null = null;
-  let dataSavedAt: number | null = null;
-
-  const preload = (...args: U) => {
-    if (timeout) {
-      clearTimeout(timeout);
-    }
-    timeout = setTimeout(() => {
-      run(...args);
-    }, 200);
-  };
+  const cache = new Map<string, { data: T; savedAt: number }>();
+  const activeRequest = new Map<string, number>();
 
   const run = (...args: U) => {
-    const newArgsStr = JSON.stringify(args);
-
-    if (argsStr !== newArgsStr) {
-      waiting = [];
-      argsStr = newArgsStr;
-      data = null;
-    }
-
-    return new Promise<T>((resolve) => {
-      if (data && argsStr === newArgsStr) {
-        if (Date.now() - dataSavedAt! < 10000) {
-          resolve(data);
-          return;
-        }
-        data = null;
-      }
-      if (waiting.length) {
-        waiting.push(resolve);
-        return;
-      }
-      waiting.push(resolve);
-
-      fun(...args).then((newData) => {
-        data = newData;
-        dataSavedAt = Date.now();
-        if (argsStr !== newArgsStr) return;
-        waiting.forEach((resolve) => resolve(newData));
-        waiting = [];
-      });
-    });
+    const key = JSON.stringify(args);
+    const now = Date.now();
+    const cached = cache.get(key);
+    if (cached && now - cached.savedAt < 10000) {
+      return Promise.resolve(cached.data);
+    }
+
+    const token = (activeRequest.get(key) || 0) + 1;
+    activeRequest.set(key, token);
+
+    return fun(...args).then((result) => {
+      if (activeRequest.get(key) === token) {
+        cache.set(key, { data: result, savedAt: Date.now() });
+      }
+      return result;
+    });
   };
Comment on lines +8 to 28 (Contributor):
🛠️ Refactor suggestion | 🟠 Major

Implement cache eviction to prevent unbounded memory growth.

Both the cache and activeRequest maps grow without bound, which can leak memory in long-running web applications. Consider implementing:

  1. Maximum cache size with LRU eviction
  2. Periodic cleanup of expired cache entries
  3. Cleanup of activeRequest entries after completion

Example cache cleanup:

const run = (...args: U) => {
  const key = JSON.stringify(args);
  const now = Date.now();
  
  // Clean expired entries periodically
  if (cache.size > 100) {
    for (const [k, v] of cache.entries()) {
      if (now - v.savedAt >= 10000) {
        cache.delete(k);
        activeRequest.delete(k);
      }
    }
  }
  
  // ... rest of implementation
};
🤖 Prompt for AI Agents
In src/common/createPreloader.ts around lines 8 to 28, the cache and
activeRequest maps grow unbounded causing memory leaks; implement bounded cache
size with LRU eviction, periodic cleanup of expired entries, and ensure
activeRequest entries are removed when a request finishes. Add a maxEntries
constant (e.g. 100), track access order (or use a Map where recently used keys
are moved to the end) and evict oldest entries when cache.size > maxEntries; on
each run call perform a quick expired-entry sweep (only when size exceeds
threshold or on a timer) that removes entries older than the TTL and also
deletes their activeRequest entries; finally, after fun(...args) resolves or
rejects, always remove the key from activeRequest (and only set cache when the
token matches) so activeRequest cannot grow indefinitely.
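One concrete way to act on the three suggestions is sketched below. This is illustrative only; `TTL_MS`, `MAX_ENTRIES`, `sweepExpired`, and `setBounded` are hypothetical names that do not appear in the PR, and the eviction shown is insertion-order (FIFO), which is a simpler stand-in for true LRU.

```typescript
const TTL_MS = 10_000;
const MAX_ENTRIES = 100;

// Suggestions 2 and 3: drop entries older than the TTL from both maps.
function sweepExpired<T>(
  cache: Map<string, { data: T; savedAt: number }>,
  activeRequest: Map<string, number>,
  now: number
): void {
  for (const [k, v] of cache.entries()) {
    if (now - v.savedAt >= TTL_MS) {
      cache.delete(k);
      activeRequest.delete(k);
    }
  }
}

// Suggestion 1: bounded insert. A Map iterates in insertion order, so the
// first key is the oldest; re-inserting a key moves it to the end.
// (True LRU would also re-insert on reads.)
function setBounded<T>(
  cache: Map<string, { data: T; savedAt: number }>,
  key: string,
  value: { data: T; savedAt: number }
): void {
  cache.delete(key);
  cache.set(key, value);
  while (cache.size > MAX_ENTRIES) {
    const oldest = cache.keys().next().value as string;
    cache.delete(oldest);
  }
}

// For the activeRequest cleanup after completion, run() could settle with:
//   return fun(...args)
//     .then(/* cache the result when the token still matches */)
//     .finally(() => {
//       if (activeRequest.get(key) === token) activeRequest.delete(key);
//     });
```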


+  const preload = (...args: U) => {
+    if (timeout) clearTimeout(timeout);
+    timeout = setTimeout(() => run(...args), 200);
+  };
 
   return { run, preload };
 }
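For readers who want the post-merge file in one piece, the added lines of the diff assemble to the following (the inline comments are editorial, not part of the PR):

```typescript
export function createPreloader<T, U extends unknown[]>(
  fun: (...args: U) => Promise<T>
) {
  let timeout: undefined | NodeJS.Timeout;
  const cache = new Map<string, { data: T; savedAt: number }>();
  const activeRequest = new Map<string, number>();

  const run = (...args: U) => {
    const key = JSON.stringify(args);
    const now = Date.now();
    const cached = cache.get(key);
    // Serve from cache while the entry is younger than the 10-second TTL.
    if (cached && now - cached.savedAt < 10000) {
      return Promise.resolve(cached.data);
    }

    // Monotonic token per key: only the most recent in-flight request
    // for these args is allowed to write its result into the cache.
    const token = (activeRequest.get(key) || 0) + 1;
    activeRequest.set(key, token);

    return fun(...args).then((result) => {
      if (activeRequest.get(key) === token) {
        cache.set(key, { data: result, savedAt: Date.now() });
      }
      return result;
    });
  };

  // Debounced variant: waits for 200 ms of quiet before calling run().
  const preload = (...args: U) => {
    if (timeout) clearTimeout(timeout);
    timeout = setTimeout(() => run(...args), 200);
  };

  return { run, preload };
}
```

A repeated `run` with identical arguments inside the 10-second TTL resolves from the cache without calling `fun` again. Note also that `timeout` is a single variable shared across all argument sets, so interleaved `preload` calls with different arguments debounce each other.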
