[Callbacks] Remove pre_initialize_structure #1160

Merged (29 commits, Feb 26, 2025)
Changes from 1 commit

Commits (29):
9b3e216  remove pre_initialize_structure (kylesayrs, Feb 17, 2025)
3bff7d1  remove preinit event (kylesayrs, Feb 17, 2025)
14e47a5  remove order test (kylesayrs, Feb 17, 2025)
c6a9e6b  Merge branch 'main' into kylesayrs/remove-preinitialize-structure (kylesayrs, Feb 17, 2025)
6b882bb  consolodate saving (kylesayrs, Feb 18, 2025)
bb35a74  typos (kylesayrs, Feb 18, 2025)
71903ff  add todos (kylesayrs, Feb 18, 2025)
d39d375  dreggs, style (kylesayrs, Feb 18, 2025)
7cc5a6d  typo (kylesayrs, Feb 18, 2025)
9865fa3  adjust typehint (kylesayrs, Feb 18, 2025)
68ce624  allow prepending (kylesayrs, Feb 18, 2025)
5b7cc03  check saved recipe contents (kylesayrs, Feb 18, 2025)
bdc4fa5  consolidate saving paths (kylesayrs, Feb 18, 2025)
a83b0aa  remove broken import (kylesayrs, Feb 18, 2025)
4efd116  Merge remote-tracking branch 'origin' into kylesayrs/consolidate-saving (kylesayrs, Feb 18, 2025)
b9f0bd1  add back def (kylesayrs, Feb 18, 2025)
29ab794  Merge remote-tracking branch 'origin' into kylesayrs/remove-preinitia… (kylesayrs, Feb 18, 2025)
0a2642b  save state (kylesayrs, Feb 18, 2025)
0c70881  Merge branch 'kylesayrs/consolidate-saving' into kylesayrs/remove-pre… (kylesayrs, Feb 18, 2025)
60371ef  remove verbose messages (kylesayrs, Feb 18, 2025)
bf9a8cd  fix double initialization (kylesayrs, Feb 18, 2025)
3d64d57  rename function (kylesayrs, Feb 25, 2025)
55984c4  remove accidentally added files (kylesayrs, Feb 25, 2025)
05fa5f6  Merge remote-tracking branch 'origin' into kylesayrs/remove-preinitia… (kylesayrs, Feb 25, 2025)
53d762e  Merge remote-tracking branch 'origin' into kylesayrs/remove-preinitia… (kylesayrs, Feb 25, 2025)
e026658  add debug statement (kylesayrs, Feb 25, 2025)
00961a0  pass model to stage runner (kylesayrs, Feb 25, 2025)
c7678b0  remove breakpoint (kylesayrs, Feb 26, 2025)
049e3cc  Merge branch 'main' into kylesayrs/remove-preinitialize-structure (dsikka, Feb 26, 2025)
src/llmcompressor/entrypoints/oneshot.py (1 addition, 0 deletions)

@@ -254,6 +254,7 @@ def _pre_process(self):
         if isinstance(self.model_args.model, (str, PosixPath)):
             self.model_args.model, _ = initialize_model_from_path(self.model_args)
 
+        breakpoint()
         patch_tied_tensors_bug(self.model_args.model)
         modify_save_pretrained(self.model_args.model)

(Note: this breakpoint() matches the "add debug statement" commit above; the later commit c7678b0, "remove breakpoint", removes it again before the merge.)
src/llmcompressor/transformers/finetune/runner.py (4 additions, 2 deletions)

@@ -6,6 +6,7 @@
 import torch
 from loguru import logger
 from torch.utils.data import Dataset
+from transformers import PreTrainedModel
 
 from llmcompressor.args import (
     DatasetArguments,

@@ -154,7 +155,9 @@ def train(self, checkpoint: str, stage: Optional[str] = None):
         # this includes saving the state, optimizer and scheduler
         self.trainer.save_model(output_dir=self._output_dir)
 
-    def run_sequential_stages(self, checkpoint: Optional[str] = None):
+    def run_sequential_stages(
+        self, model: PreTrainedModel, checkpoint: Optional[str] = None
+    ):
         """
         Run the recipe stage by stage, allowing for alternating between one-shot and
         finetuning flows. Optionally save the model output at the end of each stage

@@ -195,7 +198,6 @@ def run_sequential_stages(self, checkpoint: Optional[str] = None):
             if run_type is StageRunType.ONESHOT:
                 from llmcompressor import Oneshot
 
-                model = get_session_model()
                 self._model_args.model = model
 
                 oneshot = Oneshot.from_args(
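The net effect of the runner change is that run_sequential_stages no longer looks the model up from the global compression session; the caller hands it in. A minimal, self-contained sketch of the method's new shape (only the signature mirrors the diff; the class body, stage loop, and print statement are illustrative, not the real implementation):

from typing import Optional

from transformers import PreTrainedModel


class StageRunner:
    # Illustrative stand-in for the real StageRunner in
    # src/llmcompressor/transformers/finetune/runner.py; everything except the
    # run_sequential_stages signature is simplified.
    def run_sequential_stages(
        self, model: PreTrainedModel, checkpoint: Optional[str] = None
    ) -> None:
        # Before this PR, one-shot stages re-fetched the model via
        # get_session_model(); now the already-initialized model is passed in
        # explicitly, so every stage operates on the same object the
        # entrypoint created.
        print(f"running stages on {type(model).__name__}, checkpoint={checkpoint}")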
src/llmcompressor/transformers/finetune/text_generation.py (1 addition, 1 deletion)

@@ -422,7 +422,7 @@ def main(
     checkpoint = None
     if last_checkpoint is not None:
         checkpoint = last_checkpoint
-    stage_runner.run_sequential_stages(checkpoint)
+    stage_runner.run_sequential_stages(model, checkpoint)
 
     # exit immediately
     return
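On the caller side, main() now threads its initialized model through to the runner, as the diff above shows. A hedged usage example of the sketch given earlier (the checkpoint identifier is an arbitrary tiny Hugging Face model, not one used in this PR):

from transformers import AutoModelForCausalLM

# StageRunner here is the illustrative class from the earlier sketch,
# not the real runner.
model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2")
StageRunner().run_sequential_stages(model, checkpoint=None)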