How to preserve dataset order when using DDP? #15164
-
I need to be able to preserve the order in which the data is fed to the model when training on multiple GPUs.
-
For that, you'd need to write a custom `DistributedSampler`, pass it to the dataloader, and set
`Trainer(replace_sampler_ddp=False)`
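
As a rough sketch (not code from this thread), a custom sampler along these lines gives each rank a contiguous, non-shuffled slice of the dataset, so samples are consumed in their original order within each rank. The `OrderedDistributedSampler` name and the usage lines at the bottom are illustrative assumptions, not an official API:

```python
import math

import torch.distributed as dist
from torch.utils.data import DataLoader, Sampler


class OrderedDistributedSampler(Sampler):
    """Assigns each rank a contiguous, in-order slice of the dataset (no shuffling)."""

    def __init__(self, dataset, num_replicas=None, rank=None):
        if num_replicas is None:
            num_replicas = dist.get_world_size()
        if rank is None:
            rank = dist.get_rank()
        self.dataset = dataset
        self.num_replicas = num_replicas
        self.rank = rank
        # Pad so every rank gets the same number of samples.
        self.num_samples = math.ceil(len(dataset) / num_replicas)
        self.total_size = self.num_samples * num_replicas

    def __iter__(self):
        indices = list(range(len(self.dataset)))
        # Repeat the first few indices to pad up to total_size.
        indices += indices[: self.total_size - len(indices)]
        # Contiguous block per rank; original order is preserved within the block.
        start = self.rank * self.num_samples
        return iter(indices[start : start + self.num_samples])

    def __len__(self):
        return self.num_samples


# Usage sketch: build the sampler yourself and tell Lightning not to replace it.
# (`replace_sampler_ddp` is the flag from the reply above; newer Lightning
# releases renamed it to `use_distributed_sampler`.)
#
# sampler = OrderedDistributedSampler(train_dataset)
# train_loader = DataLoader(train_dataset, batch_size=32, sampler=sampler)
# trainer = Trainer(accelerator="gpu", devices=2, strategy="ddp", replace_sampler_ddp=False)
```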