
Should I configure FP16, optimizers, and batch_size in the DeepSpeed config when using PyTorch Lightning? #12465


Yes. You don't need to set them in the DeepSpeed config, since Lightning already fills them in for you when you set them on the Trainer and in your LightningModule; see the strategy implementation here: https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/strategies/deepspeed.py
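For illustration, a minimal sketch (not from the original thread) of where those settings live on the Lightning side instead of in the DeepSpeed JSON config. The model, dataset, and hyperparameter values are hypothetical placeholders, and it assumes the `deepspeed` package is installed alongside PyTorch Lightning:

```python
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    # Optimizer is defined here, not in the DeepSpeed config.
    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=1e-3)


# Batch size is set on the DataLoader, not in the DeepSpeed config.
dataset = TensorDataset(torch.randn(256, 32), torch.randint(0, 2, (256,)))
train_loader = DataLoader(dataset, batch_size=16)

# FP16 is requested via the Trainer's `precision` flag; Lightning propagates
# these settings into the DeepSpeed config it builds internally.
trainer = pl.Trainer(
    strategy="deepspeed_stage_2",
    accelerator="gpu",
    devices=1,
    precision=16,
    max_epochs=1,
)
trainer.fit(LitModel(), train_loader)
```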

Answer selected by ShaneTian