
Parallelise file scanning/fixing #9

@43081j


We currently process files one at a time, awaiting each before starting the next:

for (const file of files) {
  await doWork(file);
}

This won't fly in larger projects, as it'll be far too slow.

We should see if we can parallelise (multi-thread) the work. Something like this:

// runs 200 tasks at any one time (max)
// executes tasks in a worker or something
await mapAsyncWithLimitInWorkers(files, (file) => doWork(file), 200);
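
As a starting point, here's a minimal sketch of just the concurrency-limiting half of that idea. Everything here is hypothetical: `mapAsyncWithLimit` isn't an existing API, and it runs the work on the main thread (fine if `doWork` is I/O-bound), so dispatching to workers would still be a further step.

// Minimal sketch (hypothetical helper, main thread only — no workers yet).
// At most `limit` calls to `fn` are in flight at any one time.
async function mapAsyncWithLimit(items, fn, limit) {
  const results = new Array(items.length);
  let next = 0;

  // Each runner repeatedly claims the next unprocessed index.
  // Claiming is safe without locks because `next++` happens
  // synchronously between awaits on the single JS thread.
  async function runner() {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }

  // Spin up `limit` runners (or fewer if there aren't that many items)
  // and wait for all of them to drain the queue.
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, () => runner())
  );
  return results;
}

// Usage, matching the proposed call shape:
// await mapAsyncWithLimit(files, (file) => doWork(file), 200);

If the scanning/fixing turns out to be CPU-bound rather than I/O-bound, the same shape could hand `doWork` off to worker threads instead of calling it directly — e.g. an existing pool library such as piscina might cover that part.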

ideas welcome
