feat(wrapper): add gpu support in containers#254
Fastiraz wants to merge 1 commit into ThePorgs:dev from
Conversation
torch~=2.6.0
numpy~=2.2.3
Can you check for GPU availability without installing a 1 GB torch dependency?
Yes, but only with some hacks.
Instead of checking properly with a Python module, I can just execute nvidia-smi in a subprocess and check whether there is any standard output or standard error.
If there is standard output, that means NVIDIA drivers are installed.
Otherwise, if there is standard error, that means NVIDIA drivers are not installed.
For macOS, I just have to check the platform and the CPU architecture.
If the platform is macOS and the CPU architecture is ARM, we determine that the "GPU" is MPS.
I already have a version like that and it works well on Linux.
Keep in mind that I cannot test it with AMD GPUs on Linux or Windows operating systems.
If you think a version like this is better, I can make a new commit with the new function.
Here's the new function:
from typing import Optional
import platform
import subprocess


def isGPUAvailable(self) -> Optional[str]:
    system = platform.system().lower()
    try:
        # nvidia-smi only succeeds when NVIDIA drivers are installed.
        # --query-gpu=name prints exactly one line per GPU, so counting
        # the output lines gives the number of GPUs (the bare nvidia-smi
        # table would always span several lines).
        gpu_names = subprocess.run(
            ['nvidia-smi', '--query-gpu=name', '--format=csv,noheader'],
            stdout=subprocess.PIPE,
            stderr=subprocess.DEVNULL,
            check=True
        ).stdout.decode().strip().split('\n')
        return '"device=0"' if len(gpu_names) == 1 else '"all"'
    except (subprocess.CalledProcessError, FileNotFoundError):
        pass
    if system == "darwin" and platform.machine().startswith("arm"):
        # Apple Silicon: the "GPU" is MPS, but Docker cannot expose it
        # to a container, so there is no --gpus value to return.
        return None
    return None
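The strings this function returns are meant to feed Docker's `device_requests` option. As a rough sketch of that translation (the helper name and the dict form are my assumptions, mirroring the Docker Engine API's `DeviceRequest` JSON, not code from this PR):

```python
from typing import Optional


def gpus_flag_to_device_request(flag: Optional[str]) -> Optional[dict]:
    """Translate a --gpus style value into the JSON shape the Docker
    Engine API expects in HostConfig.DeviceRequests (hypothetical helper)."""
    if flag is None:
        # No usable GPU: create the container without device_requests.
        return None
    request = {
        "Driver": "nvidia",
        "Capabilities": [["gpu"]],  # same effect as the --gpus CLI flag
    }
    if flag == '"all"':
        request["Count"] = -1  # -1 means "all GPUs" in the Docker API
    else:
        # '"device=0"' style values pin the container to specific GPU IDs.
        request["DeviceIDs"] = [flag.strip('"').split("=", 1)[1]]
    return request


print(gpus_flag_to_device_request('"all"'))
print(gpus_flag_to_device_request('"device=0"'))
```

With docker-py, the resulting dicts would be passed as `docker.types.DeviceRequest(**request)` objects in the container's `device_requests` list.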
Oehhh I would like it!!!

I am also interested in the merge of this PR.

Could be useful for some GPU calculations, hashcat for example.
Description
Add GPU support using the `torch` Python module and the `device_requests` (`--gpus` in CLI) Docker argument. To do this I implemented the `isGPUAvailable` function, which checks for GPU availability and returns the appropriate value for Docker's `--gpus` argument. Finally, I added `docker_args["device_requests"]` to enable GPU support when creating a container.

Point of attention

I've also imported the `numpy` Python module, but it's not required. I imported it because of a warning message that appears when creating a new container with Exegol. It works fine even without `numpy`; the warning is just displayed when the module is missing. For now, I haven't figured out how to solve this properly.
I found a way to suppress the warning using warning filters, so it doesn't print the message. Let me know if you want me to implement it this way.
Here's an example:
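A minimal sketch of such a filter (the warning category and message pattern are assumptions; they would need to match the actual warning Exegol prints when `numpy` is missing):

```python
import warnings

# Ignore warnings whose message mentions numpy (hypothetical pattern;
# adjust it to the real warning text emitted when numpy is absent).
warnings.filterwarnings(
    "ignore",
    message=".*numpy.*",
    category=UserWarning,
)

# With the filter installed, this warning is silently dropped instead
# of being printed to the user.
warnings.warn("numpy not found, falling back", UserWarning)
```

The filter only hides the message; it does not change behavior, so `numpy` stays an optional dependency.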