tuning/sft_trainer.py (4 additions, 1 deletion)
@@ -516,11 +516,14 @@ def parse_arguments(parser, json_config=None):
         fusedops_kernels_config,
         attention_and_distributed_packing_config,
         additional,
-        _,
+        leftover,
     ) = parser.parse_args_into_dataclasses(return_remaining_strings=True)

     peft_method = additional.peft_method
     exp_metadata = additional.exp_metadata
+    if leftover:
+        logging.error("Extra un-recognized arguments found: %s", leftover)
+        sys.exit(USER_ERROR_EXIT_CODE)
Collaborator
Wouldn't it be better to log and continue?

Contributor Author
Yes, that is also OK. I would prefer an immediate exit, since there is usually a wall of log output and a warning can easily get missed, but we can go with whatever the consensus is.

Collaborator

Sorry for the delay in reviewing this. There are two issues here. To pass accelerate launch arguments via JSON for kube deployments, the user must pass a new field called accelerate_launch_args, as described here; that field is interpreted in the accelerate_launch.py script and then ignored in SFTTrainer, so this change would break that behavior. I can see how being warned about misspellings or additional arguments is helpful. We can start this off as a warning so current behavior is not broken, or this PR can address the accelerate_launch_args issue by removing those arguments in the accelerate_launch.py script.
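
One possible shape of the warning-only approach discussed above, sketched as a hypothetical standalone helper rather than the PR's actual code (the helper name and the misspelled flag in the call are illustrative only):

import logging

def warn_on_leftover(leftover):
    # Hypothetical variant of the new check: report unrecognized arguments
    # and keep running, so fields consumed earlier in the stack (such as
    # accelerate_launch_args handled by accelerate_launch.py) are not fatal.
    if leftover:
        logging.warning("Extra un-recognized arguments found: %s", leftover)

warn_on_leftover(["--leraning_rate", "1e-5"])  # illustrative misspelled flag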


     if peft_method == "lora":
         tune_config = lora_config
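
For context on where leftover comes from, here is a minimal, self-contained sketch of transformers' HfArgumentParser with return_remaining_strings=True; the dataclass and the misspelled flag are illustrative and not taken from sft_trainer.py:

from dataclasses import dataclass

from transformers import HfArgumentParser


@dataclass
class ExampleArgs:
    # Illustrative stand-in; the real parser in sft_trainer.py collects many
    # dataclasses (model, data, training, peft and acceleration configs, ...).
    model_name_or_path: str = "dummy-model"


parser = HfArgumentParser(ExampleArgs)
(example_args, leftover) = parser.parse_args_into_dataclasses(
    args=["--model_name_or_path", "my-model", "--leraning_rate", "1e-5"],
    return_remaining_strings=True,
)

# Unknown flags and their values end up in leftover, which is exactly what
# the new check in parse_arguments() inspects before exiting.
print(leftover)  # ['--leraning_rate', '1e-5']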