Setting random seed before training #9268

No. Parameters are broadcast the first time the model performs a forward pass. This is implemented in the PyTorch DistributedDataParallel wrapper, so it applies to Lightning as well.
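A minimal sketch of what this means in practice (a hypothetical standalone script, not from this discussion): even if each rank seeds its RNG differently and therefore builds a model with different initial weights, wrapping the model in `DistributedDataParallel` broadcasts rank 0's parameters to every rank, so all replicas end up identical.

```python
# Hypothetical demo: per-rank seeds differ, but DDP's broadcast
# from rank 0 makes the parameters identical on all ranks.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    dist.init_process_group("gloo")  # use "nccl" for GPU training
    rank = dist.get_rank()

    # Deliberately seed each rank differently, so the raw initial
    # weights differ across processes before wrapping.
    torch.manual_seed(1234 + rank)
    model = torch.nn.Linear(4, 2)

    # Wrapping in DDP synchronizes parameters from rank 0.
    ddp_model = DDP(model)

    # Gather one parameter tensor from every rank and verify they match.
    weight = ddp_model.module.weight.detach()
    gathered = [torch.empty_like(weight) for _ in range(dist.get_world_size())]
    dist.all_gather(gathered, weight)
    if rank == 0:
        assert all(torch.equal(gathered[0], w) for w in gathered)
        print("parameters identical on all ranks")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Run with e.g. `torchrun --nproc_per_node=2 demo.py`. The upshot for Lightning users is that the seed mainly matters for reproducible data shuffling and augmentation on each rank; the model weights are synchronized by DDP regardless.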
