How do you invoke hand tracking when using the Air 2 Ultra?

I'm having trouble getting the hand tracking function to start on boot-up when using the Beam or a Samsung S24. What is the best way to toggle it off and on?


According to Mr VR’s YouTube video, it’s not part of Nebula itself; you need to run an MR app that supports it. Mr VR XREAL Air 2 Ultra Review

From what I can tell, Nebula 3.8.0 for Android has support for running such MR apps. I have a Fold 5 (KR), but of the three apps I see there (Maze, QB Planets, Pup Pup), none list hand tracking controls in their descriptions. I’m not sure whether they need to be installed from within Nebula or can be installed from the Play Store in Android, but when I tried the latter, the app just kept crashing on launch.

I haven’t unboxed my Beam Pro yet since I’m still waiting on a screen protector to try it out on there.

I thought hand tracking was meant to replace the pointer on the Beam Pro.

On the Apple Vision Pro, hand tracking is the main pointer for all apps on the device.

Currently, hand gestures are not the primary interaction method in our Nebula system for various reasons. However, as @EyhSharkle mentioned, developers can use hand tracking to develop applications. Some apps in our AR Lab support hand gestures, like appAARatus, which you can download and try out.

Additionally, only the Ultra and an older model, the Light, support hand tracking. Other Air series glasses do not support hand tracking because they lack cameras.

Devices like the Apple Vision Pro or Quest have multiple sensors and cameras to support their hand-tracking algorithms. Our devices have only two cameras, which poses a significant challenge for the algorithms.

I’ve seen on Reddit from a few users that hand tracking was apparently supported in the Beam Pro UI with the Ultras up until an update last week. Is this true? And if so, are there plans to bring it back or is it just axed for good? Seems like a waste to not support it in the UI.

Yes, it’s true. After an internal discussion, the product team decided that the current hand tracking interaction wasn’t quite ideal or mature enough for consistent use. They’re planning to refine and optimize the feature further before releasing it again. We definitely see the potential and agree it would be a valuable addition to the UI, so stay tuned for updates!

When will this come back as a feature? I was actually getting good use out of it until I updated and was left without it. Could the feature be left available to users while you work on perfecting it on the back end? It was a beta feature, and I’m willing to accept whatever bugs come my way.

Is there another way we can enable it on our Ultras?

Definitely liked it. Hope it returns soon.


@Chadason @spacecougar I’ll share this with the product team and let you know as soon as we have any updates. Thanks for your feedback!


I agree. I paid $700 for that feature. I could have gotten the Pro for less without it.

Did you happen to get any feedback? I love the glasses, but I want to take advantage of all of their capabilities, even if it’s buggy!

No reply yet. Also, I just took them on an airplane trip, and while listening to a movie on Max with the Beam Pro, it was very hard to hear even with the volume at max. The engine noise drowns out the sound, and there’s no way to raise it further. My Samsung phone was a little better, but I still had to read lips. It would be a good idea for XREAL to re-engineer the speakers to fire toward the side rather than the bottom and to allow sound via bone conduction. That would not disturb anyone around you and would concentrate the sound into the ear canal.

They told me that hand tracking is continuously being optimized and will be reintroduced when it’s ready. However, they didn’t provide an exact timeline, so we’ll just have to wait and see.


Sad :pensive:. I wish they at least left a hidden setting somewhere to re-enable it even if it’s buggy.


Agreed. I had used it as my main option for movement within the AR space…