How to do Hand Tracking

I am trying to test a project by building it with hand tracking instead of the controller. How can I go about doing this?

These are my current specs:

  1. Nreal Light (NR-9101GGL)
  2. NRSDK: 2.2.1
  3. Nebula: 3.8.0
  4. Unity version: 2022.3.21f1

I understand that in NRInput I can change the Input Source Type to Hands, but what else do I need to do? I also added the left and right hands from the prefab, but when I tested it and loaded it into Nebula, I couldn’t see my hands. Thanks to anyone who can help.

Hihi,

You may want to try the hand tracking demo for hand gesture interaction.

Hi, I have managed to build and run the application. I can see the cube and menu options; however, when I hold a hand out, I do not see anything. Is there something I’m missing?
[Screenshot attached]

Try playing in the editor and moving your mouse while holding Shift. Can you see the simulated hands?
The settings I configured in HelloMR:

  1. Dragged the hands prefab into the scene.
  2. Set Input Source Type to Hands (see the script sketch below).
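
If you’d rather do step 2 from code, here is a minimal sketch; it assumes the NRInput.SetInputSource API from NRSDK, so double-check the name against your 2.2.1 package:

```csharp
using NRKernal;
using UnityEngine;

// Runtime equivalent of setting "Input Source Type" to Hands
// on the NRInput component in the Inspector.
public class EnableHandTracking : MonoBehaviour
{
    void Start()
    {
        // Ask NRSDK to use hands instead of the controller.
        NRInput.SetInputSource(InputSourceEnum.Hands);
    }
}
```

Attach this to any GameObject in the scene; switching it in the Inspector works just as well.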

Yes, I can see the hands when I play it in the editor.
Edit: I have also tried following the tutorial here: Hand Tracking | NRSDK
And I did try dragging the hand prefabs into the Left and Right slots, but that didn’t work either.


[Screenshot attached]

In principle, the hand tracking demo scene should build and run without any changes. Please try discarding all of your changes, especially to NRInput, and building it again.

Ok, I’ll build another scene and get back to you.

Edit: I’ve tried building another HelloMR and it loads fine, but when I put both of my hands in front of the glasses I still cannot see anything. Is it not compatible with my version of the glasses?

[Screenshot attached]

Could you send me the APK so I can have a try?

Additionally, you might need to extend your hands a bit further into the field of view (FOV) of the glasses to see them. Another possibility is that you’ve minimized the HelloMR application; in that case, you might see the XREAL logo normally but gestures won’t be detected.
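
If you want to rule out a rendering problem, you could log the raw hand state each frame; if tracking never reports a hand, the issue is upstream of the hand models. A rough sketch, assuming NRSDK’s NRInput.Hands manager and the HandState fields (verify these names against your SDK version):

```csharp
using NRKernal;
using UnityEngine;

// Logs whether either hand is currently tracked, to confirm the SDK
// is receiving hand data even when no hand model is rendered.
public class HandTrackingDebug : MonoBehaviour
{
    void Update()
    {
        HandState left = NRInput.Hands.GetHandState(HandEnum.LeftHand);
        HandState right = NRInput.Hands.GetHandState(HandEnum.RightHand);
        Debug.LogFormat("Hands running: {0} | left tracked: {1} | right tracked: {2}",
            NRInput.Hands.IsRunning, left.isTracked, right.isTracked);
    }
}
```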

I also submitted this modification to the previous repository. I don’t think it’s a problem with the glasses; if they were broken, you probably wouldn’t even be able to do plane detection in HelloMR.

Ok, I’ll send you the APK; here’s the Google Drive link:

https://drive.google.com/drive/folders/1JmjdEjbISUceBwLqXBhwNU7jXfaksy5U?usp=sharing

testHands4.apk is the build where I took a plain HelloMR scene and added the NR hands prefab.

I can see the hands; your demo looks pretty normal.

Were you able to do hand tracking with your glasses? That’s the main problem on my side right now.

Yes. Could you perform another test? Connect your glasses to the phone, launch Nebula, and enter AR Space. You should see a small dragon in the center of the menu. Click the ‘play’ button to start hand tracking. Place your hands in front of the glasses; you should see a hand model with a semi-transparent material.
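
Once tracking works, you can also poll gestures from a script. A minimal sketch, assuming the currentGesture field on HandState and the HandGesture enum from NRSDK (check both names against your SDK version):

```csharp
using NRKernal;
using UnityEngine;

// Polls the right hand each frame and logs when a pinch is recognized.
public class PinchLogger : MonoBehaviour
{
    void Update()
    {
        HandState right = NRInput.Hands.GetHandState(HandEnum.RightHand);
        if (right.isTracked && right.currentGesture == HandGesture.Pinch)
        {
            Debug.Log("Right hand pinch detected");
        }
    }
}
```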

Hello, I managed to make it work. Thanks for the help!
