System Info

macOS Sonoma 14.7
Python 3.9.19
Reproduction

Running fine-tuning with transformers. The job goes through trl's SFTTrainer and uses a bitsandbytes optimizer (both visible in the traceback below); a rough sketch of such a setup follows.
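This is only an illustrative script, not the original job: the model id, dataset, and hyperparameters are placeholders, and the exact SFTTrainer keyword set varies between trl releases (dataset_text_field later moved into SFTConfig). The ingredients that matter are the bitsandbytes optimizer and SFTTrainer.train().

# Illustrative reproduction sketch; model, dataset, and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer

train_dataset = load_dataset("imdb", split="train[:1%]")  # placeholder dataset

args = TrainingArguments(
    output_dir="sft-out",
    per_device_train_batch_size=2,
    max_steps=10,
    optim="adamw_bnb_8bit",      # routes the optimizer through bitsandbytes
    auto_find_batch_size=True,   # would explain the accelerate memory.py frame in the traceback
)

trainer = SFTTrainer(
    model="facebook/opt-350m",   # placeholder model id
    args=args,
    train_dataset=train_dataset,
    dataset_text_field="text",   # trl<=0.9-style argument
    max_seq_length=512,
)

trainer.train()

On Python 3.9, trainer.train() fails with the following traceback: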
File "/opt/dataiku/code-env/lib/python3.9/site-packages/trl/trainer/sft_trainer.py", line 451, in train output = super().train(*args, **kwargs) File "/opt/dataiku/code-env/lib/python3.9/site-packages/transformers/trainer.py", line 1938, in train return inner_training_loop( File "/opt/dataiku/code-env/lib/python3.9/site-packages/accelerate/utils/memory.py", line 153, in decorator return function(batch_size, *args, **kwargs) File "/opt/dataiku/code-env/lib/python3.9/site-packages/transformers/trainer.py", line 2095, in _inner_training_loop model, self.optimizer = self.accelerator.prepare(self.model, self.optimizer) File "/opt/dataiku/code-env/lib/python3.9/site-packages/accelerate/accelerator.py", line 1326, in prepare result = tuple( File "/opt/dataiku/code-env/lib/python3.9/site-packages/accelerate/accelerator.py", line 1327, in <genexpr> self._prepare_one(obj, first_pass=True, device_placement=d) for obj, d in zip(args, device_placement) File "/opt/dataiku/code-env/lib/python3.9/site-packages/accelerate/accelerator.py", line 1202, in _prepare_one optimizer = self.prepare_optimizer(obj, device_placement=device_placement) File "/opt/dataiku/code-env/lib/python3.9/site-packages/accelerate/accelerator.py", line 2119, in prepare_optimizer optimizer = AcceleratedOptimizer(optimizer, device_placement=device_placement, scaler=self.scaler) File "/opt/dataiku/code-env/lib/python3.9/site-packages/accelerate/optimizer.py", line 75, in __init__ self.optimizer.load_state_dict(state_dict) File "/opt/dataiku/code-env/lib64/python3.9/site-packages/bitsandbytes/optim/optimizer.py", line 176, in load_state_dict if any(p_len != s_len for p_len, s_len in zip(param_lens, saved_lens, strict=True)): TypeError: zip() takes no keyword argument
Expected behavior

Either no failure, or an explicit statement that Python 3.9 is no longer supported. In the latter case, please yank the release from PyPI as well.
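If dropping Python 3.9 is intentional, declaring that in the packaging metadata would make pip skip the release on 3.9 instead of installing it and failing at runtime. A setup.py-style sketch with a placeholder package name and version (not the project's actual build configuration):

# setup.py-style sketch; package name and version are placeholders.
from setuptools import setup

setup(
    name="example-package",
    version="0.0.1",
    # pip running under Python 3.9 will not select releases that declare this,
    # so users get a clear resolution error rather than a runtime TypeError.
    python_requires=">=3.10",
)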