Is there any way to cache the data when training with 'ddp' ? #12484
Unanswered
seungtaek94 asked this question in DDP / multi-GPU / multi-node
Hi,
I'm trying to cache my data to speed up training.
When using plain PyTorch, I cache the data as in the snippet below, which shares the loaded samples across processes using a shared dict:
https://github.com/ptrblck/pytorch_misc/blob/31ac50c415f16cf7fec277dbdba72b9fb4d732d3/shared_dict.py
Is there a similar way to cache the data in PyTorch Lightning when training with the 'ddp' strategy?
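For reference, here is a minimal sketch of how that shared-dict pattern might look wrapped in a LightningDataModule. The names (`CachedDataset`, `SharedCacheDataModule`) are mine, not part of Lightning's API, and note the caveat that with the 'ddp' strategy each rank runs `setup()` in its own process, so the cache below is shared only among that rank's DataLoader workers, not across GPUs:

```python
import torch
from torch.utils.data import Dataset, DataLoader
from multiprocessing import Manager
import pytorch_lightning as pl


class CachedDataset(Dataset):
    """Map-style dataset that caches loaded samples in a dict shared
    across the DataLoader worker processes of the current rank."""

    def __init__(self, shared_cache, length):
        self.shared_cache = shared_cache
        self.length = length

    def __len__(self):
        return self.length

    def __getitem__(self, index):
        if index not in self.shared_cache:
            # Expensive load happens only once per index; replace with real I/O.
            self.shared_cache[index] = torch.randn(3, 224, 224)
        return self.shared_cache[index]


class SharedCacheDataModule(pl.LightningDataModule):
    def __init__(self, length=1000, batch_size=32, num_workers=4):
        super().__init__()
        self.length = length
        self.batch_size = batch_size
        self.num_workers = num_workers

    def setup(self, stage=None):
        # Manager().dict() is shared with the DataLoader workers forked by
        # this process; with 'ddp' each rank builds its own copy of the cache.
        self.cache = Manager().dict()
        self.train_set = CachedDataset(self.cache, self.length)

    def train_dataloader(self):
        return DataLoader(
            self.train_set,
            batch_size=self.batch_size,
            num_workers=self.num_workers,
        )
```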