fixing reproducibility of lmeval tests #1220
Conversation
👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review. Note: this label is required to run the full testing suite, so please add it only once the PR is code complete and local testing has been performed.
Force-pushed the branch from 87dcc4c to 1f2ce00
Force-pushed the branch from 3429cc7 to b91cd7b
GG!
rebase?
Force-pushed the branch from b91cd7b to 2d2e220
SUMMARY: Fixed a bug in enabling/disabling logging and in clearing loggers. Previously, setting the relevant environment variables to any value would disable logging; now we explicitly check for `true`.

TEST PLAN: Added unit tests for enabling logging. `make test` passes.

---------
Signed-off-by: Aman Gupta <[email protected]>
Co-authored-by: Dipika Sikka <[email protected]>
Signed-off-by: Brian Dellabetta <[email protected]>
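As an illustration of the fix described above, here is a minimal sketch of parsing such a flag so that only an explicit `true` takes effect; the variable name is hypothetical and not taken from the repo.

```python
import os


def logging_disabled(var_name: str = "EXAMPLE_DISABLE_LOGGING") -> bool:
    """Return True only when the variable is explicitly set to 'true'.

    Before the fix, any non-empty value (e.g. 'false' or '0') would disable
    logging; with this check, the value must equal 'true', case-insensitively.
    """
    return os.environ.get(var_name, "").strip().lower() == "true"
```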
…ot (#1212)

Order of reviews: #1206, #1207, #1209, #1212 <-- Here, #1214

SUMMARY:
* Move the preprocessing and postprocessing logic out of `src/llmcompressor/transformers/finetune/text_generation.py` and into `src/llmcompressor/entrypoints/utils.py`

TEST PLAN: Pass tests

Signed-off-by: Brian Dellabetta <[email protected]>
SUMMARY: The current README shows which algorithms we support and how to run them, but it is still hard for a user to understand when to use which. Add guidance on which optimization to apply based on the user's use case and hardware.

TEST PLAN: N/A

Signed-off-by: Brian Dellabetta <[email protected]>
## Purpose ##
* Simplify the modifier lifecycle by removing the ability for modifiers to affect the model after the modifier's `end` event
* This allows the `on_event` method to be removed in a future change

## Background ##
* The `leave_enabled` option was originally intended as a shortcut to simplify recipes which used magnitude pruning during iterative pruning and then needed the masks to stay enabled during stabilization SFT
* This change proposes making the recipe clearer by requiring a ConstantPruningModifier after the MagnitudePruningModifier becomes inactive (see the sketch after this message)

## Changes ##
* Remove `MagnitudePruningModifier.leave_enabled` with a deprecation warning

Signed-off-by: Kyle Sayers <[email protected]>
Signed-off-by: Brian Dellabetta <[email protected]>
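For illustration only, a minimal sketch of keeping a removed option as an accepted-but-ignored argument that emits a deprecation warning. The class and field names follow the description above; the real modifier in llm-compressor is more involved and may differ.

```python
import warnings
from typing import Optional


class MagnitudePruningModifier:
    """Toy stand-in showing only the deprecation path for `leave_enabled`."""

    def __init__(self, start: int = 0, end: int = -1,
                 leave_enabled: Optional[bool] = None):
        if leave_enabled is not None:
            # The option no longer has any effect; recipes should instead add a
            # ConstantPruningModifier that starts when this modifier ends.
            warnings.warn(
                "MagnitudePruningModifier.leave_enabled is deprecated and ignored. "
                "Add a ConstantPruningModifier after this modifier to keep masks applied.",
                DeprecationWarning,
                stacklevel=2,
            )
        self.start = start
        self.end = end
```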
## Purpose ##
* Fix tests which break as a result of https://github.com/vllm-project/llm-compressor/pull/new/kylesayrs/replace-self-hosted

## Changes ##
* Use self-hosted model stub

Signed-off-by: Kyle Sayers <[email protected]>
Signed-off-by: Brian Dellabetta <[email protected]>
Force-pushed the branch from fffabf0 to 341299f
SUMMARY:
LM Eval weekly tests are failing; this resolves two issues.
TEST PLAN:
No new source code.
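The PR text does not show the fix itself. For context only, a generic sketch of pinning random seeds before an evaluation run, which is one common way to make lm-eval-style tests reproducible; none of this is taken from the PR's diff.

```python
import random

import numpy as np
import torch


def seed_everything(seed: int = 42) -> None:
    """Pin the Python, NumPy, and PyTorch RNGs so repeated eval runs match."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # Trade speed for determinism in cuDNN kernels.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
```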