SigLIP memory usage #1017
-
I am currently running with the following: `python -m open_clip_train.main`. Images should be 224x224, but I can't use a batch size above ~200 without getting OOM on a 24 GB VRAM machine. How did the original authors run 32k? That would require 3840 GB of memory, wouldn't it? They claimed to run it on 4 TPUv4s, which is only 128 GB. They write "For example, with four TPU-v4 chips, we could fit a batch size of 4096". That's over 5 times the batch size that I can fit.
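(The 3840 GB figure is just a linear extrapolation from what fits locally, assuming activation memory scales roughly proportionally with batch size:)

```python
# Rough linear extrapolation of memory with batch size (numbers from above).
observed_batch = 200     # largest batch that fits locally
observed_mem_gb = 24     # local GPU VRAM
target_batch = 32_000    # SigLIP-scale batch size

print(observed_mem_gb * target_batch / observed_batch)  # ~3840 GB
```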
Replies: 1 comment
-
Nevermind, I needed `--grad-checkpointing`.
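For anyone else hitting this: gradient (activation) checkpointing trades extra compute for memory by discarding intermediate activations during the forward pass and recomputing them in the backward pass. A minimal PyTorch sketch of the idea (not open_clip's actual implementation) using the standard `torch.utils.checkpoint` utility:

```python
import torch
from torch.utils.checkpoint import checkpoint

# A stand-in for one transformer block; sizes here are illustrative only.
block = torch.nn.Sequential(
    torch.nn.Linear(1024, 4096),
    torch.nn.GELU(),
    torch.nn.Linear(4096, 1024),
)

x = torch.randn(8, 1024, requires_grad=True)

# Without checkpointing, the intermediate 4096-wide activation is kept for backward.
# With checkpointing, it is discarded and recomputed when gradients are needed.
y = checkpoint(block, x, use_reentrant=False)
y.sum().backward()
```

In `open_clip_train`, the `--grad-checkpointing` flag turns this on for the model, which is why much larger batches fit in the same VRAM (at the cost of some extra forward compute).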