Hi, I am using the torch backend with gcnn_keras, but my GPU is not being used. If I call `torch.cuda.is_available()` I get `True`, but I'm unsure what I need to do on the Keras side to make this work. Any help would be appreciated.
Hello, last time I checked, GPU acceleration worked with kgcnn and torch.
A few things to check:
Does your PyTorch installation work on the GPU apart from keras_core?
Could you run a simple non-linear Dense layer with keras_core on the GPU with the torch backend?
If all of the above works, then I would be interested in which script you are trying to run on the GPU.
I thought that all backend torch functions were supported on the GPU, but maybe I added something that isn't.
I would then have to start searching...
Note that with torch, jagged (ragged) input is not yet supported at the moment.