Model parallel (splitting layers vertically) #12320
Unanswered · Androsimus asked this question in DDP / multi-GPU / multi-node
Replies: 1 comment, 3 replies
-
Hi @Androsimus, Lightning supports multiple strategies for model-parallel training. There's a dedicated documentation page on the topic: https://pytorch-lightning.readthedocs.io/en/latest/advanced/model_parallel.html#model-parallel :)
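For instance, one of the strategies covered on that page (DeepSpeed ZeRO stage 3) shards each layer's parameters across the available GPUs and gathers them on demand, so very large layers no longer have to fit on a single device. A minimal sketch of enabling it through the Trainer is below; the model, layer sizes, and device count are made up for illustration, and it assumes `pytorch-lightning` and `deepspeed` are installed:

```python
# Minimal sketch: enabling a model-parallel (sharded) strategy via the Trainer.
# BigBoringModel, the layer sizes, and devices=4 are illustrative assumptions.
import torch
import torch.nn as nn
import pytorch_lightning as pl


class BigBoringModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        # A stack of large linear layers; with DeepSpeed ZeRO stage 3 their
        # parameters are sharded across the GPUs instead of replicated.
        self.net = nn.Sequential(*[nn.Linear(4096, 4096) for _ in range(8)])

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        # Dummy objective just to make the sketch self-contained.
        return self(batch).sum()

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


trainer = pl.Trainer(
    accelerator="gpu",
    devices=4,                     # number of GPUs to shard across
    strategy="deepspeed_stage_3",  # ZeRO-3: shards params, grads, optimizer state
    precision=16,
)
# trainer.fit(BigBoringModel(), train_dataloaders=your_dataloader)
```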
-
Hi everyone!
Please tell me, is it possible to train a model on many GPUs when some layers need to be split vertically (i.e., sharded across devices)?
Are there any best practices or advice for such tasks in PyTorch Lightning?