Additionally check number of devices of Trainer for evaluation before warning when using DistributedSampler
#14068
Unanswered
function2-llx
asked this question in DDP / multi-GPU / multi-node
Replies: 0 comments
Hi everyone,

When using `DistributedSampler` (e.g., with `DDPStrategy`) during validation/test, a warning suggests setting `Trainer(devices=1)` even when `devices` is already 1. How about additionally checking whether the `Trainer`'s number of devices is 1 before emitting the warning? See: https://github.com/Lightning-AI/lightning/blob/d8e5e7f889646e2ae3f480941b4c9e18434e994d/src/pytorch_lightning/trainer/connectors/data_connector.py#L301
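For illustration, a minimal sketch of the proposed guard is below. The helper name `_maybe_warn_distributed_sampler` and the warning text are assumptions made for this sketch, not the actual Lightning internals; only `Trainer.num_devices`, `torch.utils.data.DistributedSampler`, and `rank_zero_warn` are taken from the public PyTorch Lightning / PyTorch APIs.

```python
# Sketch of the proposed check; the helper name and warning text are
# illustrative, not the actual code in data_connector.py.
from torch.utils.data import DistributedSampler
from pytorch_lightning.utilities.rank_zero import rank_zero_warn


def _maybe_warn_distributed_sampler(trainer, dataloader) -> None:
    sampler = getattr(dataloader, "sampler", None)
    # Proposed change: only warn when more than one device is in use, so a
    # run that already has `Trainer(devices=1)` is not told to set it again.
    if isinstance(sampler, DistributedSampler) and trainer.num_devices > 1:
        rank_zero_warn(
            "Using `DistributedSampler` during validation/test. It is recommended"
            " to use `Trainer(devices=1)` so each sample is evaluated exactly once."
        )
```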