
Use of unsafe PyTorch save #14

Open
ways opened this issue Jan 29, 2025 · 3 comments · May be fixed by #27
Comments

ways (Contributor) commented Jan 29, 2025

$ bandit --recursive --ini .bandit.ini bris
Test results:
>> Issue: [B614:pytorch_load_save] Use of unsafe PyTorch load or save
   Severity: Medium   Confidence: High
   CWE: CWE-502 (https://cwe.mitre.org/data/definitions/502.html)
   More Info: https://bandit.readthedocs.io/en/1.8.2/plugins/b614_pytorch_load_save.html
   Location: bris/callbacks.py:103:12
102	
103	            torch.save(model, inference_checkpoint_filepath)
@ways ways self-assigned this Jan 29, 2025
ways (Contributor, Author) commented Jan 29, 2025

From 33cde6d

ways (Contributor, Author) commented Jan 29, 2025

A safe alternative is to replace torch.save / torch.load with the safetensors library from Hugging Face, which provides a safe serialization and deserialization mechanism.

Is this something you consider important to fix, @tnipen?
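
For illustration, a minimal sketch of what that swap could look like (the model, layer sizes, and file name below are placeholders, not the actual objects in bris/callbacks.py):

import torch
from safetensors.torch import save_file, load_file

# Placeholder model and checkpoint path; the real ones live in the callback.
model = torch.nn.Linear(4, 2)
inference_checkpoint_filepath = "inference_checkpoint.safetensors"

# Save only the state dict. safetensors stores raw tensors and never
# pickles Python objects, which is what avoids the CWE-502 risk.
save_file(model.state_dict(), inference_checkpoint_filepath)

# Load: rebuild the architecture in code, then restore the weights.
restored = torch.nn.Linear(4, 2)
restored.load_state_dict(load_file(inference_checkpoint_filepath))

Note that unlike torch.save(model, ...), this stores weights only, so the loading side has to construct the model before loading the checkpoint.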

ways (Contributor, Author) commented Jan 30, 2025

Waiting for #17 to ease testing

@ways ways assigned einrone and unassigned tnipen Feb 4, 2025
@ways ways linked a pull request Feb 4, 2025 that will close this issue
@ways ways added the security label Feb 10, 2025