We currently do an inefficient blocking iteration similar to this:
```js
for (const file of files) {
  await doWork(file);
}
```
This won't fly in larger projects, as it'll be far too slow.
We should see if we can parallelise (multi-thread) the work. Something like this:
```js
// runs 200 tasks at any one time (max)
// executes tasks in a worker or something
await mapAsyncWithLimitInWorkers(files, (file) => doWork(file), 200);
```
ideas welcome
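
As a starting point, here is a minimal in-process sketch of the scheduling half (no worker threads yet). `mapAsyncWithLimit` is a hypothetical helper, not an existing API; a full solution would additionally move `doWork` onto `worker_threads`:

```js
// Hypothetical sketch: map over items with at most `limit` tasks
// in flight at once. Runs in-process; a worker_threads version
// would dispatch fn to a thread pool instead of awaiting it here.
async function mapAsyncWithLimit(items, fn, limit) {
  const results = new Array(items.length);
  let next = 0; // shared index; safe because JS is single-threaded

  // Each runner pulls the next unclaimed index until none remain.
  async function runner() {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i], i);
    }
  }

  // Spawn up to `limit` runners and wait for all of them to drain.
  const runners = Array.from(
    { length: Math.min(limit, items.length) },
    () => runner()
  );
  await Promise.all(runners);
  return results;
}
```

Usage would then mirror the proposal above: `await mapAsyncWithLimit(files, (file) => doWork(file), 200);`. The pull-based runners avoid chunking, so a slow file never stalls the other slots.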