Do we need torchrun for multi-GPU training?
#13192
Unanswered
jxchen01 asked this question in DDP / multi-GPU / multi-node
Replies: 0 comments
I am trying to combine PyTorch Lightning multi-GPU training with the SmartCacheDataset in MONAI. Here is a simple script to reproduce my setup.

When I run the code with `python monai_w_plt_multigpu.py`, I get the pickling issue described at https://pytorch-lightning.readthedocs.io/en/1.4.0/advanced/multi_gpu.html#make-models-pickleable. But if I run the same code with `torchrun monai_w_plt_multigpu.py`, everything works fine, with no error at all.

So, do we always need `torchrun` to do multi-GPU training?

Basic info: I am using `torch==1.11.0` and `pytorch-lightning==1.6.3`.
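For context on why the two launch modes can differ: spawn-based Lightning strategies (e.g. `ddp_spawn`) create worker processes from inside the script, so the model object must be pickled to reach them, and any unpicklable attribute (a lambda, an open file handle, etc.) triggers the error linked above. With `torchrun`, each process runs the script from scratch, so no model pickling is needed. Below is a minimal, hedged sketch of the picklability check the Lightning docs suggest; `Model`, `double`, and `is_pickleable` are hypothetical names for illustration, not part of Lightning's or MONAI's API:

```python
import pickle


def double(x):
    # Module-level functions are pickleable by reference.
    return x * 2


class Model:
    def __init__(self, use_lambda: bool):
        # A lambda attribute is a classic source of "cannot pickle" errors
        # under spawn-based multiprocessing.
        self.transform = (lambda x: x * 2) if use_lambda else double


def is_pickleable(obj) -> bool:
    """Round-trip the object through pickle, as the Lightning docs suggest."""
    try:
        pickle.loads(pickle.dumps(obj))
        return True
    except (pickle.PicklingError, AttributeError, TypeError):
        return False


print(is_pickleable(Model(use_lambda=True)))   # False: lambda attribute
print(is_pickleable(Model(use_lambda=False)))  # True: module-level function
```

Running this check on the LightningModule (and anything attached to it, such as dataset or transform objects) before training can reveal which attribute breaks `python script.py` while `torchrun` still works.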