[BOUNTY - $300] Support Multi-GPU #223
Comments
I agree with supporting one exo instance that uses multiple GPUs.
Chiming in as a new user with a multi-GPU setup. One instance is easiest. Users can simply control GPU selection with the CUDA_VISIBLE_DEVICES environment variable.
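For the multi-instance route, the CUDA_VISIBLE_DEVICES idea above can be sketched as building one environment per GPU, so each launched process sees exactly one device (which the CUDA runtime then exposes as device 0 inside that process). This is a minimal sketch; `per_gpu_envs` is a hypothetical helper, not part of exo:

```python
import os

def per_gpu_envs(num_gpus):
    """Build one environment dict per GPU, each exposing a single device
    via CUDA_VISIBLE_DEVICES. Pass each dict as env= to subprocess.Popen
    when launching one exo instance per GPU."""
    return [dict(os.environ, CUDA_VISIBLE_DEVICES=str(i)) for i in range(num_gpus)]
```

Each instance launched with one of these environments enumerates only its assigned GPU, so no code inside the instance needs to know about the others.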
I was able to do this: I forked the repo on GitHub and added configuration to integration.py and integration_engine.py.
The only remaining issue is getting the exo console page to show 2 GPUs instead of one; still testing.
Approach 1: multiple exo instances per device, one GPU assigned to each.
- Pros:
- Cons:

Approach 2: a single exo instance per device, with multiple GPUs assigned.
- Aspects:
- Pros:
- Cons:

Case 1: Case 2:
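Approach 2 above (one instance driving multiple GPUs) ultimately needs some mapping of model work onto devices. As one illustration, here is a contiguous layer-to-GPU assignment; `assign_layers` is a hypothetical helper for discussion, not exo's actual partitioning logic:

```python
def assign_layers(num_layers, num_gpus):
    """Split model layers into contiguous blocks, one block per GPU.
    Earlier GPUs absorb the remainder when layers don't divide evenly.
    Returns {layer_index: gpu_index}."""
    per, extra = divmod(num_layers, num_gpus)
    mapping, start = {}, 0
    for gpu in range(num_gpus):
        count = per + (1 if gpu < extra else 0)
        for layer in range(start, start + count):
            mapping[layer] = gpu
        start += count
    return mapping
```

A contiguous split keeps cross-GPU traffic to one activation hand-off per boundary, which is the usual choice for pipeline-style sharding.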
I implemented a temporary workaround using approach 2 in #656.
I suppose this isn't a full solution for multi-GPU; it's just a wrapper around CUDA_VISIBLE_DEVICES.
Currently you can only run one exo instance on each device.
There are some design decisions here: