[Oneshot refactor] Refactor initialize_model_from_path #1109

Open · wants to merge 9 commits into base: main
Conversation

@horheynm (Collaborator) commented on Jan 28, 2025

ORDER OF REVIEWS:

  1. [Oneshot Refactor] Rename get_shared_processor_src to get_processor_name_from_model #1108
  2. [Oneshot Refactor] dataclass Arguments #1103
  3. [Oneshot refactor] Refactor initialize_model_from_path #1109 <- current PR
  4. [Oneshot Refactor] Main refactor #1110

SUMMARY:
Refactor `initialize_model_from_path` to decouple the `training_args`-dependent logic from the oneshot (non-`training_args`) logic.
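
For illustration, a minimal sketch of the intended split (function and attribute names here are hypothetical, not the PR's actual implementation):

```python
# Hypothetical sketch of the decoupling; names do not mirror the PR exactly.
from transformers import AutoModelForCausalLM, set_seed


def initialize_oneshot_model(model_args):
    """Oneshot path: no training_args required."""
    return AutoModelForCausalLM.from_pretrained(
        model_args.model,
        trust_remote_code=getattr(model_args, "trust_remote_code_model", False),
    )


def initialize_training_model(model_args, training_args):
    """Training path: training_args-dependent logic stays isolated here."""
    set_seed(training_args.seed)  # seeding is driven by training_args
    return initialize_oneshot_model(model_args)
```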

TEST PLAN:

  • Pass all tests
  • Search for `initialize_model_from_path` usages with grep (see the command sketch below)
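
For example, a plain grep over the Python sources (directory names assume the repo layout):

```bash
grep -rn --include="*.py" "initialize_model_from_path" src examples tests
```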

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

Note: This is required to complete the testing suite; please only add the label once the PR is code complete and local testing has been performed.

@horheynm added the `ready` (When a PR is ready for review) label on Jan 28, 2025
@kylesayrs (Collaborator) left a comment:

Please move set_seed outside of the initialize_model_from_path function. Otherwise lgtm.

@horheynm (Collaborator, Author) commented on Feb 5, 2025

> Please move set_seed outside of the initialize_model_from_path function. Otherwise lgtm.

It's part of the training args, https://github.com/huggingface/transformers/blob/main/src/transformers/training_args.py#L1044,

and it is in the original code:
https://github.com/vllm-project/llm-compressor/pull/1109/files#diff-4b4df7cb1f6042ff11af61f87b5d6dffd8e07a7ef182a8aa659566a5ee77fc04L198

If there is an argument for moving it outside, we can do it in a follow-up PR.
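
For context, a minimal sketch of the pattern under discussion (simplified, not the actual function body; it only shows why `set_seed` currently sits inside the function when the seed comes from `training_args`):

```python
from transformers import set_seed


def initialize_model_from_path(training_args, model_args):
    # The seed is a TrainingArguments field, which is why the call
    # currently lives inside this function.
    set_seed(training_args.seed)
    ...  # model loading follows
```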

dsikka added a commit that referenced this pull request on Feb 5, 2025:
[Oneshot Refactor] Rename get_shared_processor_src to get_processor_name_from_model (#1108)

ORDER OF REVIEWS:
1. #1108 <- current PR
2. #1103
3. #1109
4. #1110

SUMMARY:
* Rename `get_shared_processor_src` to `get_processor_name_from_model`
* Fix the signature of `initialize_processor_from_path` so that `teacher` is optional


TEST PLAN:
* Pass all existing tests
* Search for `get_shared_processor_src` using `pygrep`:
```bash
# Recursively search Python files; directories default to src, examples, tests.
function pygrep() {
    local search_term="$1"
    shift
    local search_dirs="${*:-src examples tests}"
    grep -rn --include="*.py" -E "$search_term" $search_dirs
}
```
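
For example, once the function is defined in the shell:

```bash
pygrep "get_shared_processor_src"       # searches src, examples, tests by default
pygrep "get_shared_processor_src" src   # or restrict the search to src/
```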

---------

Co-authored-by: Dipika Sikka <[email protected]>
dsikka pushed a commit that referenced this pull request on Feb 11, 2025:
ORDER OF REVIEWS:
1. #1108
2. #1103 <- current PR
3. #1109
4. #1110

SUMMARY:
Refactor the dataclasses used for the llm-compressor entrypoints (oneshot,
train, apply) to decouple irrelevant attributes from the existing dataclasses.
For example, recipe lived in training_args, but a recipe is contained in a
session, not in the trainer that training_args governs.

Dataclass refactor details are in
https://docs.google.com/document/d/1YbR1dTQmCzqhGk74m5msBzqoPHQgB6dVxDtf6cTmetc/edit?usp=sharing

Note:
#1110 takes care of using a new entrypoint that will prohibit the
post_train / oneshot call from using training_args. The current entrypoint
still needs training_args for oneshot to function; this PR only refactors
the dataclasses.


Before:
ModelArguments:
https://github.com/vllm-project/llm-compressor/blob/6fa5a5eecc7d363ec73474d011d40135b6374179/src/llmcompressor/transformers/finetune/model_args.py#L6
DataTrainingArguments:
https://github.com/vllm-project/llm-compressor/blob/6fa5a5eecc7d363ec73474d011d40135b6374179/src/llmcompressor/transformers/finetune/data/data_args.py#L70
TrainingArguments:
https://github.com/vllm-project/llm-compressor/blob/6fa5a5eecc7d363ec73474d011d40135b6374179/src/llmcompressor/transformers/finetune/training_args.py#L10

After:
ModelArguments:
https://github.com/vllm-project/llm-compressor/pull/1103/files#diff-58fd0f7ae4564317960ae0d4d4b2cdb97b9588c1915f062915e74ecf51b5502cR6
DatasetArguments:
https://github.com/vllm-project/llm-compressor/pull/1103/files#diff-5e43f74ba5d8327b937adada3c7f30a7efb13f9a44cb3fdb5e1a2a12b8b8ea27R70
RecipeArguments:
https://github.com/vllm-project/llm-compressor/pull/1103/files#diff-0ff9c048a4deb55e5459054bdc61a5d8c81da9c94588ec2355e6b2c2ec8675d1R6
TrainingArguments:
https://github.com/vllm-project/llm-compressor/pull/1103/files#diff-249ee96763dd50956a7309f898eda4bcaa91c6af653474568fbda10b5a39c817R12

TEST PLAN:
* Pass all existing tests
* Search dataclass arguments using `pygrep`
```bash
function pygrep() {
    local search_term="$1"
    shift
    local search_dirs="${*:-src examples tests}"
    grep -rn --include="*.py" -E "$search_term" $search_dirs
}
```

---------

Co-authored-by: Rahul Tuli <[email protected]>