With the hand controllers, can I click UI items, such as a toggle or checkbox, with the fingertips?

Hi

I am slowly building a new NReal app in Unity, an hour here and there after my day job (I’ve worked in Java, C#, MDX and cube reporting in investment banking since 1998!!! :)).

I’ve got my splash screens working and I am currently assembling a UI. This is going well. I added a slider to act as a binary toggle (two options, so 0-1) that switches between laser and hands.

By default the laser pointer works, and I can use it to click the toggle; on change it flips to hands…and they appear OK.

So far so good. :slight_smile:

I cannot get it to work so I can just press the slider with a fingertip. I tried adding a Rigidbody2D to my toggle, a 2D collider with isTrigger enabled, and an OnTriggerEnter2D function.

This didn’t work. I also tried a BoxCollider around it, but that didn’t work either. Do I need to add a collider to the hand?

Am I overthinking it? :slight_smile: I am learning Unity as I go too. :slight_smile:

Code attached to the toggle/slider:

using System;
using System.Collections;
using UnityEngine;
using UnityEngine.UI;
using NRKernal;
using UnityEngine.EventSystems; // Required when using Event data.
using TMPro;

public class HandLaserSwitch : MonoBehaviour, IPointerClickHandler, IPointerDownHandler
{
    public Slider laserHandsSlider;
    public TMPro.TextMeshProUGUI debugText;

    public GameObject rightHandModel;
    public GameObject leftHandModel;

    /// <param name="eventData"> Current event data.</param>
    public void OnPointerClick(PointerEventData eventData)
    {
        HandleSliderClick();
    }

    /// <param name="eventData"> Current event data.</param>
    public void OnPointerDown(PointerEventData eventData)
    {
        HandleSliderClick();
    }

    public void StartHandTracking()
    {
        NRInput.SetInputSource(InputSourceEnum.Hands);
        NRInput.RaycastMode = RaycastModeEnum.Gaze;
        NRInput.RaycastersActive = true;

        SetHandsActive(true);
    }

    public void StartLaserController()
    {
        SetHandsActive(false);
        NRInput.SetInputSource(InputSourceEnum.Controller);
        NRInput.RaycastMode = RaycastModeEnum.Laser;
        NRInput.RaycastersActive = true;
    }

    public void SetHandsActive(bool isActive)
    {
        if (leftHandModel)
        {
            leftHandModel.SetActive(isActive);
        }
        if (rightHandModel)
        {
            rightHandModel.SetActive(isActive);
        }
    }  

    /*public void ToggleRaycastMode()
    {
        Debug.Log("HandTrackingExample: ToggleRaycastMode");
        NRInput.RaycastMode = NRInput.RaycastMode == RaycastModeEnum.Gaze ? RaycastModeEnum.Laser : RaycastModeEnum.Gaze;
    }

    public void SwitchHandVisual()
    {
        Debug.Log("HandTrackingExample: SwitchHandVisual");
        //handModelsManager.ToggleHandModelsGroup();
    }*/

    public void HandleSliderClick()
    {
        if (laserHandsSlider != null)
        {
            // normalizedValue is a float in [0, 1]; the cast truncates,
            // so this reads 1 only when the slider is at its maximum.
            int laserHandState = (int)laserHandsSlider.normalizedValue;
            debugText.text = "Slider val: " + laserHandState;

            switch (laserHandState)
            {
                case 0:
                    StartLaserController();
                    break;
                case 1:
                    StartHandTracking();
                    break;
            }
        }
    }

    private void OnTriggerEnter(Collider other)
    {
        // Mmm..doesn't seem to work. (Note: this only fires if this object has
        // a 3D Collider and at least one side of the contact has a Rigidbody.)
        debugText.text = "Slider val: " + 3;
    }
}
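Side note: since HandleSliderClick above only runs from the pointer handlers, it could also be wired to the slider’s onValueChanged event so it fires however the value changes. A quick sketch, added as a Start() method on the same class:

    private void Start()
    {
        if (laserHandsSlider != null)
        {
            // Fire the same handler whenever the slider value changes,
            // regardless of which input source moved it.
            laserHandsSlider.onValueChanged.AddListener(_ => HandleSliderClick());
        }
    }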

Yes, Unity physics requires that both sides (hand and button) have a collider, and at least one of them must have a Rigidbody for a collision to be detected :+1:
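For illustration, a minimal sketch of that setup: the toggle gets a 3D collider with isTrigger ticked plus this script, and the hand side gets a small collider, a kinematic Rigidbody, and (here) an assumed "Fingertip" tag. The script and tag names are placeholders, not NRSDK API:

using UnityEngine;

// Sketch: attach to the toggle/button object (needs its own 3D Collider with
// isTrigger ticked). The hand side needs a small Collider and a kinematic
// Rigidbody so Unity raises trigger events between the two.
public class FingertipButton : MonoBehaviour
{
    private void OnTriggerEnter(Collider other)
    {
        // "Fingertip" is an assumed tag set on the hand-side collider object.
        if (other.CompareTag("Fingertip"))
        {
            Debug.Log("Fingertip touched the toggle: " + other.name);
        }
    }
}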

And if you feel brave enough :wink: you can check what’s directly touching a joint by getting the joint Pose (it has both position and rotation) when you perform a pinch gesture, using this:

HandState handState = NRInput.Hands.GetHandState(HandEnum.RightHand);
Pose thumbTipPose = handState.GetJointPose(HandJointID.ThumbTip);
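As a rough sketch of putting that together in a component (isTracked and isPinching are assumed to be exposed on HandState, as in the SDK’s hand-tracking demos):

using NRKernal;
using UnityEngine;

// Sketch: each frame, if the right hand is tracked and pinching,
// log where the thumb tip is.
public class PinchProbe : MonoBehaviour
{
    private void Update()
    {
        HandState handState = NRInput.Hands.GetHandState(HandEnum.RightHand);
        if (handState.isTracked && handState.isPinching)
        {
            Pose thumbTipPose = handState.GetJointPose(HandJointID.ThumbTip);
            Debug.Log("Pinch at " + thumbTipPose.position);
        }
    }
}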

OK, great, thanks for your prompt reply. :slight_smile:

I shall investigate haha

Thanks

Leigh

Hey

I got some time to play around.

I actually found this (the HandJointColliderEntity script in the NRSDK hand-tracking demos), but it still does not seem to work.

I read that, of these collision functions, the OnTrigger* ones are only called if isTrigger is ticked.
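For reference, a minimal sketch of the distinction (plain Unity, nothing NRSDK-specific):

using UnityEngine;

// Sketch: which callback fires depends on the isTrigger flag on the collider.
public class ContactProbe : MonoBehaviour
{
    // Fires when this object's collider has isTrigger ticked (overlap only,
    // no physical response).
    private void OnTriggerEnter(Collider other)
    {
        Debug.Log("Trigger overlap with " + other.name);
    }

    // Fires for solid (non-trigger) colliders that physically collide.
    private void OnCollisionEnter(Collision collision)
    {
        Debug.Log("Solid collision with " + collision.gameObject.name);
    }
}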

I then tried to run the app locally, on my dev PC, and I saw a red error at runtime:

MissingComponentException: There is no 'Collider' attached to the "NRHand_R" game object, but a script is trying to access it.
You probably need to add a Collider to the game object "NRHand_R". Or your script needs to check if the component is attached before using it.
NRKernal.NRExamples.HandJointColliderEntity.Update () (at Assets/NRSDK/Demos/HandTracking/Scripts/HandJointColliderEntity.cs:31)

So the HandJointColliderEntity script I found needs a collider…but which type?


Any type would work, actually: all colliders inherit from the base Collider class, so any of them will do.
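For example, a sketch of making sure the hand object has one at startup — a small SphereCollider is a cheap fit for a joint (attaching this to NRHand_R is an assumption about your hierarchy):

using UnityEngine;

// Sketch: guarantee some Collider exists before scripts like
// HandJointColliderEntity try to access it.
public class EnsureHandCollider : MonoBehaviour
{
    private void Awake()
    {
        if (GetComponent<Collider>() == null)
        {
            SphereCollider sphere = gameObject.AddComponent<SphereCollider>();
            sphere.isTrigger = true;  // overlap events, no physical push-back
            sphere.radius = 0.01f;    // roughly fingertip-sized (~1 cm)
        }
    }
}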

Ok

Thanks, will check later. It was also a quick lesson in the difference between 2D and 3D colliders: as the hands are 3D, I’ll go for a BoxCollider on both, since I read somewhere that 2D and 3D colliders/physics don’t interact or get calculated together.

This video was also an interesting watch regarding the Sleep Threshold setting:
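For context, Sleep Threshold lives in Project Settings > Physics: a rigidbody whose (mass-normalized) kinetic energy stays below it is put to sleep and stops generating contacts, which can matter for slow fingertip motion. It can also be read or set from code — a quick sketch:

using UnityEngine;

// Sketch: lower the global sleep threshold so very slow-moving bodies
// (e.g. a finger easing onto a button) are less likely to be put to sleep.
public class SleepThresholdTweak : MonoBehaviour
{
    private void Awake()
    {
        Debug.Log("Current sleep threshold: " + Physics.sleepThreshold);
        Physics.sleepThreshold = 0.001f; // Unity's default is 0.005
    }
}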