How to use the memory of both GPUs at the same time in one script? #15680
Unanswered
kobrafarshidi asked this question in DDP / multi-GPU / multi-node
Replies: 1 comment
I am sending my source so my error is easier to see: this is the data module, this is part of the model, this is the trainer, and this is how I fit the model.
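For reference, here is a minimal sketch of that structure, assuming a pytorch_lightning 1.x release; `MyDataModule`, `MyModel`, the random tensors, and all hyperparameters are placeholders rather than the original code:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class MyDataModule(pl.LightningDataModule):
    """Placeholder data module built on random tensors."""

    def __init__(self, batch_size: int = 32):
        super().__init__()
        self.batch_size = batch_size

    def setup(self, stage=None):
        # Dummy data standing in for the real dataset.
        self.train_set = TensorDataset(
            torch.randn(1024, 16), torch.randint(0, 2, (1024,))
        )

    def train_dataloader(self):
        return DataLoader(self.train_set, batch_size=self.batch_size, shuffle=True)


class MyModel(pl.LightningModule):
    """Placeholder model: a small two-layer classifier."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
        self.loss_fn = nn.CrossEntropyLoss()

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = self.loss_fn(self(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


if __name__ == "__main__":
    dm = MyDataModule()
    model = MyModel()
    # gpus=2 is the argument suggested in the question (1.x API).
    trainer = pl.Trainer(max_epochs=1, gpus=2)
    trainer.fit(model, datamodule=dm)
```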
Hi,
I have a script written with pytorch_lightning and I need to run it on two GPUs. The only advice I could find was to add
pl.Trainer(gpus=2)
to my script, but it still runs on only one GPU and cannot use both of them. This is essential for me because my program cannot run on a single GPU.
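For what it's worth, here is a minimal sketch of a Trainer configured for both GPUs, assuming a pytorch_lightning 1.x release and reusing the placeholder `MyDataModule` / `MyModel` classes from the sketch above (the `my_script` import path is hypothetical):

```python
import torch
import pytorch_lightning as pl

# MyDataModule and MyModel are the placeholder classes from the sketch
# above; this import path is hypothetical.
from my_script import MyDataModule, MyModel

if __name__ == "__main__":
    # First sanity check: PyTorch itself must see both devices.
    print("visible GPUs:", torch.cuda.device_count())

    dm = MyDataModule()
    model = MyModel()

    # Explicit multi-GPU configuration: one DDP process per GPU.
    trainer = pl.Trainer(
        accelerator="gpu",
        devices=2,
        strategy="ddp",
        max_epochs=1,
    )
    # The older spelling from the question, still accepted in 1.x:
    # trainer = pl.Trainer(gpus=2, strategy="ddp")

    trainer.fit(model, datamodule=dm)
```

If `torch.cuda.device_count()` reports 1, the environment (for example `CUDA_VISIBLE_DEVICES`) is hiding the second GPU and no Trainer argument will help. `strategy="ddp"` also expects to be launched as a normal Python script (`python train.py`); it generally does not work from inside a notebook. Finally, note that DDP replicates the full model on each GPU and only splits the batches, so it uses the memory of both GPUs but does not pool it; if the model itself is too large for one GPU, a sharded strategy such as DeepSpeed or FSDP (the exact strategy names depend on the Lightning version) is the usual way to spread a single model across both devices.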