Hi!
Can you explain why GAST-Net's dimension is (25, 17, 256) (T, N, C) after executing the first Graph Attention Block in the paper?
I think an input of shape (256, 27, 17) becomes (256, 9, 17) after going through the Conv2D with kernel (3, 1) and stride (3, 1) in gast_net.py.
When I print the residual shape in the network, it shows (256, 9, 17) # (C, T, N).
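Here is a minimal sketch of how I'm checking this, assuming a batch size of 1 and a Conv2d with 256 input/output channels (the exact channel counts and layer names in gast_net.py may differ):

```python
import torch
import torch.nn as nn

# Assumptions: batch size 1, 256 channels, T = 27 frames, N = 17 joints.
x = torch.randn(1, 256, 27, 17)  # (B, C, T, N)

# Strided temporal convolution: kernel (3, 1), stride (3, 1), no padding.
conv = nn.Conv2d(in_channels=256, out_channels=256,
                 kernel_size=(3, 1), stride=(3, 1))

y = conv(x)
# Output T = floor((27 - 3) / 3) + 1 = 9, joint dimension unchanged:
print(y.shape)  # torch.Size([1, 256, 9, 17]) -> (C, T, N) = (256, 9, 17)
```

So by this calculation the temporal dimension should be 9, not the 25 reported in the paper.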
Thanks