How to do Hand Tracking

I am trying to test a project by building it with hand tracking instead of the controller. How can I go about doing this?

These are my current specs:

  1. Nreal Light (NR-9101GGL)
  2. NRSDK: 2.2.1
  3. Nebula: 3.8.0
  4. Unity version: 2022.3.21f1

I understand that in NRInput I can change the input source to Hands, but what else do I need to do? I also added the left and right hands from the prefab, but when I tested it in Nebula I couldn't see my hands. Thanks to anyone who can help.

Hihi,

You may want to try the hand tracking demo for hand gesture interaction.
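
If you'd rather wire up your own scene, the script-side equivalent of the Inspector setting is roughly the following. This is a minimal sketch; I'm recalling the NRSDK 2.x names (NRInput.SetInputSource, InputSourceEnum.Hands) from memory, so verify them against the SDK version in your project.

```csharp
using NRKernal;
using UnityEngine;

// Minimal sketch: select hand tracking as the input source at startup,
// the script-side equivalent of setting "Input Source Type" to Hands on
// the NRInput component. API names recalled from NRSDK 2.x; verify locally.
public class EnableHandTracking : MonoBehaviour
{
    void Start()
    {
        NRInput.SetInputSource(InputSourceEnum.Hands);
    }
}
```

With that set, the left and right hand prefabs you dragged into the scene should start receiving tracking data.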

Hi, I have successfully managed to build and run the application. I can see the cube and menu options; however, when I hold a hand out, I do not see anything. Is there something I'm missing?

Try to play in the editor, and move your mouse while holding Shift. Can you see the simulated hands?
The settings I configured in HelloMR (a quick runtime check is sketched after the list):

  1. Dragged the hands prefab into the scene.
  2. Set Input Source Type to Hands
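
To confirm the hands are actually reporting data, both for the Shift+mouse simulation in the editor and for real hands on device, a throwaway logger along these lines can help. It's a rough sketch; NRInput.Hands.GetHandState, HandEnum, and the isTracked/currentGesture fields are how I remember the NRSDK 2.x hand API, so check them against the HandState type in your package.

```csharp
using NRKernal;
using UnityEngine;

// Rough diagnostic: print the tracking state of both hands every frame.
// In the editor, the Shift+mouse emulation should drive these values; on
// device, isTracked should only become true once a hand is inside the
// glasses' camera FOV.
public class HandStateLogger : MonoBehaviour
{
    void Update()
    {
        HandState left = NRInput.Hands.GetHandState(HandEnum.LeftHand);
        HandState right = NRInput.Hands.GetHandState(HandEnum.RightHand);

        Debug.Log($"Left tracked: {left.isTracked}  " +
                  $"Right tracked: {right.isTracked}  " +
                  $"Right gesture: {right.currentGesture}");
    }
}
```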

Yes, I can see the hands when I play it in editor.
Edit: I have also tried to follow the tutorial here: Hand Tracking | NRSDK
I also tried dragging the hand prefabs into the Left and Right fields, but that didn't work either.


Theoretically, you could build the hand tracking scene without any changes and it should run. Please try discarding all changes, especially to NRInput, and build it again.

Ok, I’ll build another scene and get back to you.

Edit: I've built another HelloMR and it loads fine, but when I put both of my hands in front of the glasses I still cannot see anything. Is it not compatible with my version of the glasses?

Could you send me the APK so I can have a try?

Additionally, you might need to extend your hands a bit within the field of view (FOV) of the glasses to see them. Another possibility is that you've minimized the HelloMR application; in that case, you might see the XREAL logo normally but gestures won't be detected.

I also submitted this modification to the previous repository. I don’t think it’s a problem with the glasses. If the glasses were broken, you might not even be able to perform plane detection in HelloMR.

Ok, will send you the APK and the link to the Google Drive:

https://drive.google.com/drive/folders/1JmjdEjbISUceBwLqXBhwNU7jXfaksy5U?usp=sharing

testHands4.apk is the build where I just put a simple HelloMR scene and added the NR hands prefab into the scene.

I can see the hands; your demo looks pretty normal.

Were you able to do hand tracking from your glasses? That’s the main problem on my side right now.

Yes. Could you perform another test? Connect your glasses to the phone, launch Nebula, and enter AR Space. You should see a small dragon in the center of the menu. Click the ‘play’ button to start hand tracking. Place your hands in front of the glasses; you should see a hand model with a semi-transparent material.

Hello, I managed to make it work. Thanks for the help!

@survivalqueen27 What did you do to get it to work?

@Dorix I’m currently having issues getting hand tracking to work on my Ultra, NRSDK 2.2.1, and Unity 2022.3.21 using a Samsung Fold 5 (same processor as S23)

I’m able to build and run all the demos except for hand tracking.
I built an initial hand tracking app using the documentation and got it to work for one day. It was a brand-new scene: NRCameraRig, NRInput switched to Hands, and the NR left and right hand prefabs inserted. When it was working, I could see the line and sphere overlay for hand tracking. But after I added a cube to the scene, everything seemed to break.

The app starts with the Unity logo, then the touchpad screen shows up, but then it goes back to the main menu.

  • This crash happens with my app, the demo app, and also the testHands4.apk that survivalqueen27 posted a few days ago.
  • I've also tried building using the repo Unity project, and that doesn't help.
  • I've tried re-installing Nebula as well, with the same broken result.
  • In the app I built, when I rebuild with NRInput switched from Hands back to Controller, the app runs properly and doesn't crash, but of course there is no hand tracking.
  • When I run the dragon demo in Nebula on the same phone, the hand tracking works perfectly every time, including now.

I don’t really understand why SDK hand tracking refuses to work for me anymore after working for one day.
I don’t think I changed anything, except for adding a cube to the scene.
Are there other settings I should look at?
- I have the minimum API set to Android 10 (API level 29) and the target set to Android 13 (API level 33)… I have API levels 33 and 34 installed from the Android SDK.
- I also tried modifying the AndroidManifest file as suggested in the documentation (roughly the snippet below).
Maybe there are settings on the phone itself I need to look for?
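
For reference, the manifest change I made is essentially the camera permission the docs call for. I'm paraphrasing from memory, so treat this as an assumption and follow the exact entries in the NRSDK documentation:

```xml
<!-- Hand tracking runs on the glasses' cameras, so the app needs camera
     permission. The exact set of required entries comes from the NRSDK docs. -->
<uses-permission android:name="android.permission.CAMERA" />
```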

Regarding hand tracking: we have recently been in touch with other developers and have been trying to replicate the issue, including your situation. We have identified an intermittent bug, and our engineers are actively investigating the root cause. We apologize for the poor experience this has caused.

From your description, since the dragon demo in Nebula works perfectly, it indicates that your hardware is not the issue. It is likely a bug in the SDK. Based on my personal experience, you could try deleting the Library folder in your Unity Project and then reopening the project to build the APK again. However, please note that this may not necessarily resolve the problem.

@Dorix Would you mind trying an APK I built for hand tracking? It is simply a blank starting scene with NRCameraRig, NRInput set to Hands with the Lhand and Rhand prefabs, and a cube in the scene.

I will try clearing the Library folder, but before I do that, is there any other information I can collect to help?
Some other information: my minimum API is set to 29, and the target is set to "highest installed"; in Android Studio I believe I installed API levels 33 and 34 (the editor-script equivalent is sketched below).
Also, my phone is using One UI 6.1.
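
In case it helps with comparing setups, this is the editor-script equivalent of those Player Settings. Just a sketch that mirrors what I set in the UI; I cast the raw API level to the enum so it doesn't depend on a named enum member existing in this Unity version.

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Sketch: pin the Android API levels from code, mirroring the Player
// Settings described above (min 29, target = "highest installed").
public static class AndroidApiLevelSettings
{
    [MenuItem("Tools/Apply Android API Levels")]
    public static void Apply()
    {
        // Cast the raw level so this doesn't rely on a named enum member.
        PlayerSettings.Android.minSdkVersion = (AndroidSdkVersions)29;

        // AndroidApiLevelAuto corresponds to "highest installed" in the UI.
        PlayerSettings.Android.targetSdkVersion = AndroidSdkVersions.AndroidApiLevelAuto;
    }
}
#endif
```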

I will test the APK you built and will get back to you with the results as soon as possible.

It looks like your environment settings are all correct. Additionally, you might want to try building the Hand Tracking demo scene directly instead of creating one from scratch. See if this approach works without issues. (Although, theoretically, they should be the same, I’ve found in my recent tests that the Hand Tracking demo seems to have fewer issues.)

Edited:
It works perfectly on my Samsung Note 20. I can see the gray cube you placed in front of the scene and the hands composed of spheres and sticks. I'm using Nebula 3.8.0.

Hmm… I can see if I can find a Note 20 and maybe another newer Snapdragon device to try. I'm also using Nebula 3.8.0 and have already tried reinstalling it.

Ok, I was able to successfully run the HandTrackingTest.apk on my S20 FE (Snapdragon 865, One UI 5.1, Android 13). The hand tracking experience feels a little slower than it did on the Fold 5 back when the hand tracking app first worked.

So I think what I'll try next is building against different target API levels for the Fold 5 to see if that changes anything. Otherwise, I'm not sure what else to try.

Just to add:
When the app starts on my Fold 5, the Unity logo shows, then a different touchpad display appears on the phone screen (with home and app buttons). In my glasses view, the touchpad also shows for just a second, but split in half with the two sides on the left and right, and then the app exits by itself back to the home screen.

Also, I don't know if this matters, but on my S20, when I plug in the Ultra, I get the dialog box that says "Allow Nebula to access XREAL Air 2 Ultra". On my Fold 5, this dialog box does not appear (maybe I checked the "Always open…" option before).