
Improve hand interaction profiles support #1562

Merged 3 commits into main from msft_hand_interaction on Oct 8, 2024
Conversation

@svillar commented Oct 7, 2024

This PR addresses three issues:

  1. Improve gesture detection with hand interaction profiles; we were mixing pinch and squeeze events.
  2. Do not filter out events when the "ready" action is false; that is not the meaning of the ready state value.
  3. Add support for the XR_MSFT_hand_interaction profile, similar to the Khronos one. It is available on several devices/OSes where the Khronos one is not supported.

Review comments on app/src/openxr/cpp/OpenXRInputSource.cpp were resolved.
@javifernandez left a comment

I have just tried the patch in the Quest3 and the beams projected from the hands are not correctly aligned.

@svillar commented Oct 7, 2024

> I have just tried the patch in the Quest3 and the beams projected from the hands are not correctly aligned.

Yep, I've seen that as well, but we don't really control that. In other devices like Lynx-R1 the very same code works fine.

@javifernandez commented
> I have just tried the patch in the Quest3 and the beams projected from the hands are not correctly aligned.
>
> Yep, I've seen that as well, but we don't really control that. In other devices like Lynx-R1 the very same code works fine.

But we can't merge the patch as it is now, can we? This makes hand tracking on Meta devices regress.

@svillar commented Oct 8, 2024

> I have just tried the patch in the Quest3 and the beams projected from the hands are not correctly aligned.
>
> Yep, I've seen that as well, but we don't really control that. In other devices like Lynx-R1 the very same code works fine.
>
> But we can't merge the patch as it is now, can we? This makes hand tracking on Meta devices regress.

In what sense do you think it's a regression? The origin of the beam is indeed less accurate, but I haven't seen any functional regression so far. Have you?

@javifernandez commented
> I have just tried the patch in the Quest3 and the beams projected from the hands are not correctly aligned.
>
> Yep, I've seen that as well, but we don't really control that. In other devices like Lynx-R1 the very same code works fine.
>
> But we can't merge the patch as it is now, can we? This makes hand tracking on Meta devices regress.
>
> In what sense do you think it's a regression? The origin of the beam is indeed less accurate, but I haven't seen any functional regression so far. Have you?

It's a visual regression; it doesn't affect functionality, indeed. That implies the severity should be low, I agree.

Some actions are linked to specific hand gestures: by facing the
palm of a hand towards the head and pinching, one can go back
(left hand) or exit Wolvic (right hand).

When using hand interaction profiles (XR_EXT_hand_interaction,
XR_MSFT_hand_interaction) we read the state of the actions
that are bound to the paths provided by the profiles. In our
case we get both trigger events (pinches) and squeeze events
(closing the fist).

The key here is that those events are not mutually exclusive;
when pinching we are actually partially closing the fist. That
means that when doing the pinch action we could receive
a 0.9 value for pinching but also a 0.5 value for squeezing.

The code was not discerning between both cases properly when the
hand action was detected (palm facing the head). That led to
situations in which pinches were not properly detected, because
when iterating over the actions we were:
1. detecting a pinch -> setting the APP button as clicked
2. detecting a partial squeeze (not passing the threshold)
   -> setting the APP button as released

That sequence effectively prevented pinch events from being
detected, because the click status was cleared before upper
layers could detect the back/exit actions. We had a workaround
in place that basically lowered the threshold, but that was
just a hack that did not address the actual problem.
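The fix described in the commit message above can be sketched roughly as follows. This is a hypothetical helper, not the actual code in OpenXRInputSource.cpp: the point is that when both values fire during the same frame, the pinch must win if it passes the threshold, and a sub-threshold squeeze must never clear a pinch that was just detected.

```cpp
// Hypothetical sketch of disambiguating pinch vs. squeeze when both
// actions report non-zero values (function and enum names are illustrative).
enum class Gesture { None, Pinch, Squeeze };

Gesture detectGesture(float pinchValue, float squeezeValue,
                      float threshold = 0.85f) {
    // Pinch and squeeze are not mutually exclusive: pinching partially
    // closes the fist, so squeezeValue may read ~0.5 during a pinch.
    // Check pinch first so a partial squeeze cannot mask or clear it.
    if (pinchValue >= threshold)
        return Gesture::Pinch;
    if (squeezeValue >= threshold)
        return Gesture::Squeeze;
    return Gesture::None;
}
```

With this ordering, the sequence that previously broke detection (pinch at 0.9 plus squeeze at 0.5) resolves to a single pinch instead of a click that is immediately released.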
XR_EXT_hand_interaction defines a "ready" action for some of its
paths to inform clients of the API that the tracking system
considers the observed hand shape ready to perform some gesture
(pinch, grasp...).

We were incorrectly ignoring the "value" of the associated path in
those cases. That is wrong for two reasons:
1. "ready" is only true when the gesture is almost completed, which
means we would be ignoring all the values before the gesture is
detected. For example, the "value" for pinch returns a float between
0.0 and 1.0 indicating how close a given gesture is to a pinch.
2. If "ready" is false we were not updating the value of the
button, meaning that a trigger button would remain in the
clicked state forever after the first pinch happened.

This was detected in the beta of Meta's OS v71.
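The second fix can be sketched as below, under the assumption stated in the commit message: the analog value must be read and applied even when "ready" is false, otherwise nothing ever releases the button after the first pinch. The struct and function names here are illustrative, not Wolvic's actual API.

```cpp
// Hypothetical sketch: the button's analog value and click state are
// updated unconditionally; "ready" is not a gate on reading the value.
struct ButtonState {
    float value = 0.0f;
    bool clicked = false;
};

void updateTriggerButton(ButtonState& button, bool ready, float value,
                         float clickThreshold = 0.9f) {
    // "ready" only signals that the hand shape is in position to
    // perform the gesture; ignoring updates while it is false would
    // leave the button stuck in the clicked state after a pinch.
    (void)ready;
    button.value = value;                      // always track the analog value
    button.clicked = value >= clickThreshold;  // release works even if !ready
}
```

Calling this with a low value and ready == false correctly releases the button, which is exactly the case that was broken on Meta's OS v71 beta.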
@svillar changed the title from "Add support for MSFT hand interaction profile" to "Improve hand interaction profiles support" on Oct 8, 2024
@felipeerias left a comment
Tested on Quest 2 (v71) and Pico, no regressions. LGTM.

@javifernandez left a comment

looks good

Support for the Khronos XR_EXT_hand_interaction extension was added
not so long ago. That extension defines a new interaction profile
that can be used to get input from hands using the same OpenXR
concepts that are valid for physical controllers.

There is another vendor extension that does the same,
XR_MSFT_hand_interaction, which is supported on devices such as
Meta headsets (Quest 2, Quest 3, Quest Pro) and the Lynx-R1.

Even though they are two different extensions, the nice thing is
that we can use the same code path for both, reducing the
complexity of the hand tracking code. Moreover, as they both
define interaction profiles, they integrate nicely with the code
that handles physical controllers.

Although unlikely, it is possible that a given runtime provides
both the Khronos and the MSFT extensions. In that case we select
the Khronos one.
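The selection rule in the last paragraph can be sketched as a small helper. The extension name strings are the real OpenXR identifiers; the function itself is a hypothetical illustration of the preference order, not the code Wolvic ships.

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Hypothetical sketch: given the runtime's supported extensions, pick
// the hand interaction profile, preferring the Khronos extension when
// both it and the MSFT vendor extension are available.
std::string pickHandInteractionExtension(
        const std::vector<std::string>& supported) {
    auto has = [&](const char* name) {
        return std::find(supported.begin(), supported.end(), name)
               != supported.end();
    };
    if (has("XR_EXT_hand_interaction"))
        return "XR_EXT_hand_interaction";   // Khronos extension wins
    if (has("XR_MSFT_hand_interaction"))
        return "XR_MSFT_hand_interaction";  // vendor fallback
    return "";  // no hand interaction profile available
}
```

Since both extensions expose their input through interaction profiles, whichever one is picked feeds into the same binding code used for physical controllers.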
@svillar svillar merged commit c8ea552 into main Oct 8, 2024
22 checks passed
@svillar svillar deleted the msft_hand_interaction branch October 8, 2024 16:58
3 participants