Improve hand interaction profiles support #1562
Conversation
I have just tried the patch in the Quest3 and the beams projected from the hands are not correctly aligned.
Yep, I've seen that as well, but we don't really control that. On other devices like the Lynx-R1 the very same code works fine.
But we can't merge the patch as it is now, can we? It makes hand tracking on Meta devices regress.
In which sense do you think it's a regression? The origin of the beam is indeed less accurate, but I haven't seen any functional regression so far. Have you?
It's a visual regression; it doesn't affect the functionality, indeed. That implies the severity should be low, I agree.
Some actions are linked to specific hand gestures: by facing the palm of each hand toward the head and pinching, one can go back (left hand) or exit Wolvic (right hand). When using hand interaction profiles (XR_EXT_hand_interaction, XR_MSFT_hand_interaction) we read the state of the actions that are bound to the paths provided by the profiles. In our case we get both trigger events (pinches) and squeeze events (closing the fist). The key here is that those events are not mutually exclusive; in fact, when pinching we're partially closing the fist. That means that when doing the pinch action we could receive a 0.9 value for pinching but also a 0.5 value for squeezing.

The code was not discerning between both cases properly when the hand action was detected (palm facing the head). That led to situations in which those pinches were not properly detected, because when iterating over the actions we were:
1. detecting a pinch -> setting the APP button as clicked
2. detecting a partial squeeze (not passing the threshold) -> setting the APP button as released

That sequence effectively prevented pinch events from being detected, because the click status was cleared before upper layers could detect those back/exit actions. The consequence was that those hand gestures were not properly detected. We had a workaround in place, which basically lowered the threshold, but that was just a hack that didn't address the actual problem.
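The bug described above can be sketched as follows. This is an illustrative example, not Wolvic's actual code: the struct, function, and threshold names are hypothetical. The point is that, since pinch and squeeze values are not mutually exclusive, the clicked state must be computed by combining the per-action results rather than letting the last-iterated action (a sub-threshold squeeze) overwrite the state set by an earlier one (a detected pinch):

```cpp
#include <cassert>

// Hypothetical sketch: both the trigger ("pinch") and squeeze ("grasp")
// values arrive every frame and are not mutually exclusive -- a pinch
// partially closes the fist too, so e.g. pinch=0.9 and squeeze=0.5
// can be reported simultaneously.
struct GestureState {
    float pinchValue = 0.0f;
    float squeezeValue = 0.0f;
};

constexpr float kPinchThreshold = 0.85f;   // illustrative values
constexpr float kSqueezeThreshold = 0.85f;

// Buggy behavior: iterate the actions and let each one overwrite the
// click state, so a partial squeeze clears the click set by the pinch.
bool IsClickedBuggy(const GestureState& s) {
    bool clicked = false;
    clicked = s.pinchValue >= kPinchThreshold;     // pinch detected -> clicked
    clicked = s.squeezeValue >= kSqueezeThreshold; // partial squeeze -> cleared!
    return clicked;
}

// Fixed behavior: combine the per-action results, so a sub-threshold
// squeeze cannot clear a click produced by a detected pinch.
bool IsClicked(const GestureState& s) {
    return s.pinchValue >= kPinchThreshold ||
           s.squeezeValue >= kSqueezeThreshold;
}
```

With pinch=0.9 and squeeze=0.5, the buggy version reports "not clicked" (the squeeze check overwrote the pinch result), while the fixed version keeps the click latched.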
Force-pushed from 7176e94 to a16a7ff.
XR_EXT_hand_interaction defines several "ready" actions for some of its paths to inform clients of the API that the tracking system considers the observed hand shape ready to perform some gesture (pinch, grasp...). We were ignoring the "value" of the associated path in those cases. However, that is wrong for two reasons:
1. "ready" is only true when the gesture is almost completed, meaning we'd be ignoring all the values before the gesture is detected. For example, the "value" for pinch returns a float between 0.0 and 1.0 indicating how close a given gesture is to a pinch.
2. If "ready" is false, we were not updating the value of the button, meaning that a trigger button would remain in the clicked state forever after the first pinch happened.

This was detected in Meta's OS v71 beta version.
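A minimal sketch of the fix, with hypothetical names (this is not Wolvic's actual button code): the button's value and clicked state are updated unconditionally every frame instead of being gated on the extension's "ready" flag, so intermediate values are propagated and a stale click is cleared once the value drops:

```cpp
#include <cassert>

// Hypothetical button state; "value" is the analog gesture amount
// (e.g. pinch strength in [0.0, 1.0]) reported by the profile.
struct ButtonState {
    float value = 0.0f;
    bool clicked = false;
};

constexpr float kClickThreshold = 0.9f; // illustrative value

// Fixed update: propagate "value" regardless of "ready". Gating on
// "ready" both dropped all sub-gesture values and left the button
// stuck clicked once "ready" went back to false.
void UpdateButton(ButtonState& button, float value, bool /*ready*/) {
    button.value = value;
    button.clicked = value >= kClickThreshold;
}
```

With this, a pinch at 0.95 sets the click, and a later frame reporting 0.1 with "ready" false still clears it instead of leaving the button latched.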
Force-pushed from a16a7ff to ff5df61.
Tested on Quest 2 (v71) and Pico, no regressions. LGTM.
looks good
Support for the Khronos XR_EXT_hand_interaction extension was added not so long ago. That extension defines a new interaction profile that can be used to get input from hands using the same OpenXR concepts that are valid for physical controllers. There is another vendor extension that does the same, XR_MSFT_hand_interaction, which is supported on devices like Meta's (Quest 2, Quest 3, Quest Pro) or the Lynx-R1.

Even though they are two different extensions, the nice thing is that we can use the same code path for both, reducing the mess and complexity of the hand tracking code. Not only that: as they both define interaction profiles, they integrate nicely with the code that handles physical controllers. Although unlikely, it's possible that a given runtime provides both the Khronos and MSFT extensions; in that case we select the Khronos one.
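The selection rule described above can be sketched as follows; this is an illustrative helper with a hypothetical name, not Wolvic's actual code. It takes the runtime's supported-extension list and prefers the Khronos profile over the vendor one when both are available:

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Hypothetical helper: pick which hand interaction extension to enable.
// Prefers the Khronos XR_EXT_hand_interaction over the vendor
// XR_MSFT_hand_interaction when the runtime exposes both; returns an
// empty string if neither is supported.
std::string SelectHandInteractionExtension(
        const std::vector<std::string>& supported) {
    auto has = [&](const char* name) {
        return std::find(supported.begin(), supported.end(),
                         std::string(name)) != supported.end();
    };
    if (has("XR_EXT_hand_interaction"))
        return "XR_EXT_hand_interaction";
    if (has("XR_MSFT_hand_interaction"))
        return "XR_MSFT_hand_interaction";
    return ""; // no hand interaction profile available
}
```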
Force-pushed from ff5df61 to e2b4185.
This PR addresses 3 issues: