We should provide a signal when hands are performing a system gesture.
While the user is doing this, the experience should stop trying to detect gestures or draw target rays.
Currently these interfere on the Quest browser: a pinch gesture is detected when the user tries to make the gesture for the menu or Oculus buttons.
A read-only boolean on XRHand would probably be enough. Maybe call it inSystemGesture?
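A minimal sketch of what the proposed attribute might look like in WebIDL — the name `inSystemGesture` is only a suggestion from this issue, not a settled API:

```webidl
// Hypothetical addition; attribute name is not final.
partial interface XRHand {
  // True while the hand is performing a platform/system gesture
  // (e.g. the menu or Oculus button gesture on Quest).
  readonly attribute boolean inSystemGesture;
};
```

An experience could then check this flag each frame and skip its own gesture detection and target-ray rendering while it is true.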
/agenda