CUDA out of memory on multiple GPUs #40
Comments
Hi @zxccade, could you please provide the script you are using?
Hi there, here is the script I used. I was trying to run inference on long videos with the model across multiple GPUs.
Hi author,
Thanks for your great work on Sa2VA!
I ran into an issue when trying to run inference on multiple GPUs using the script you provided for Sa2VA-26B.
When I hit the CUDA out-of-memory error, only the first GPU was being used for inference, while the other GPUs still had spare memory.
Could you help me address this issue?
Best,
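For context, when a Transformers model loads entirely onto GPU 0 and OOMs while other GPUs sit idle, the usual remedy is to shard it with `device_map="auto"` (backed by Accelerate), optionally capping each device with a `max_memory` map. Below is a minimal sketch; the model id and memory figures are assumptions for illustration, not values from this thread:

```python
def build_max_memory(num_gpus: int, per_gpu: str = "40GiB", cpu: str = "64GiB") -> dict:
    """Build a max_memory map in the format accepted by transformers'
    from_pretrained(..., device_map="auto", max_memory=...):
    integer keys for GPU indices, plus a "cpu" key for offload headroom."""
    budget = {i: per_gpu for i in range(num_gpus)}
    budget["cpu"] = cpu
    return budget

# Hypothetical usage (requires transformers and multiple GPUs, so not run here):
# from transformers import AutoModel
# model = AutoModel.from_pretrained(
#     "ByteDance/Sa2VA-26B",           # model id is an assumption
#     torch_dtype="bfloat16",
#     trust_remote_code=True,
#     device_map="auto",               # shard layers across all visible GPUs
#     max_memory=build_max_memory(4),  # cap each of 4 GPUs at 40 GiB
# )

print(build_max_memory(2))
```

Without a `max_memory` cap, `device_map="auto"` fills GPUs in order, which can still leave GPU 0 disproportionately loaded; capping per-GPU memory forces a more even split.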