
snakeoil.compression can't cleanly handle the parallelizable compressor changing during runtime #98

@ferringb

Description


See the module-level detection in snakeoil.compression:

# detection happens once, at module import time
import multiprocessing

from snakeoil import process

try:
    lbzip2_path = process.find_binary("lbzip2")
    lbzip2_compress_args = (f"-n{multiprocessing.cpu_count()}",)
    lbzip2_decompress_args = lbzip2_compress_args
    parallelizable = True
except process.CommandNotFound:
    lbzip2_path = None
    parallelizable = False
    lbzip2_compress_args = lbzip2_decompress_args = ()

Note that lbzip2 is detected at module import time, and the code uses that result from that point forward.

This will break if someone does something like merging bzip2 and unmerging lbzip2 without the module being reloaded. Conversely, if a merge adds lbzip2, it still won't be used for that process without an explicit module reload.

This can be fixed by moving the binary check to invocation time; that would also allow pbzip2 support to be added properly. A minimal sketch of what that could look like is below.
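
A rough sketch of invocation-time detection, assuming snakeoil's process.find_binary / process.CommandNotFound; the helper names (_parallel_bzip2, bzip2_binary) and the pbzip2 fallback are illustrative only, not the project's actual API:

# Sketch only: the binary is resolved at call time, so installing or
# removing lbzip2 between calls is picked up without a module reload.
import multiprocessing

from snakeoil import process


def _parallel_bzip2():
    """Return (path, args) for the first available parallel bzip2, or (None, ())."""
    ncpu = multiprocessing.cpu_count()
    # lbzip2 takes -n<threads>, pbzip2 takes -p<processors>
    for binary, args in (("lbzip2", (f"-n{ncpu}",)), ("pbzip2", (f"-p{ncpu}",))):
        try:
            return process.find_binary(binary), args
        except process.CommandNotFound:
            continue
    return None, ()


def bzip2_binary(parallelize=False):
    """Resolve the compressor at invocation time rather than at import time."""
    if parallelize:
        path, args = _parallel_bzip2()
        if path is not None:
            return path, args
    # fall back to plain bzip2
    return process.find_binary("bzip2"), ()

Resolving the binary per call means an lbzip2 merge or unmerge between calls is picked up automatically, at the cost of a filesystem lookup each time; that lookup could be cached if it ever mattered.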
