Onscreen Button in Unity / NRSDK

Hello there, I am building a small game in Unity with NRSDK. Everything is running fine, but I can't seem to add a functional on-screen button to the canvas inside “NRVirtualdisplayer”.

Please see the attached screenshot. The button (white) is clickable both in the Simulator and on the Android device itself, and it is mapped to simulate a “b” press on the keyboard via the standard on-screen button script from the on-screen controls package.

Pressing “b” on the physical keyboard works as intended, but pressing the on-screen button does nothing. “b” is simply mapped to the Jump input in the project's Input Manager, and, as said, that mapping works when “b” is pressed on the keyboard.
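For completeness, the setup in code terms is roughly this (the control path and names are from memory, so please treat them as approximate): the OnScreenButton component on the UI button has its Control Path set to <Keyboard>/b, and the gameplay script reads the old Input Manager, along these lines:

using UnityEngine;

// Approximate sketch of the gameplay side (not my verbatim code):
// "Jump" is bound to the "b" key in the project's Input Manager.
public class JumpOnB : MonoBehaviour
{
    void Update()
    {
        if (Input.GetButtonDown("Jump"))
        {
            Debug.Log("Jump triggered");
            // ...actual jump logic here...
        }
    }
}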

Am I overlooking something? Shouldn't clicking the on-screen button register as a keyboard “b” press?

Best!

Hi developer, does anything happen when you click the on-screen button? In other words, does the button itself register the click? If not, did you check the Raycast Target box shown in the screenshot?

Hello Doris,

yes, the button registers a click/touch, both in the Simulator and on the phone.

Raycast Target is enabled. Do you have any other ideas?

Try adding the CanvasRaycastTarget script (or something like that) to the canvas.

I've tested a simple button on the virtual displayer, and it worked, so you probably have a problem in the logic of your code. You can remove the “PressBscript” and print logs to test whether the virtual button itself is working.
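For the log test, something as small as this would do (the class name and log text are just placeholders):

using UnityEngine;
using UnityEngine.UI;

// Attach to the same GameObject as the Button: prints a line whenever
// the UI click actually comes through on the virtual display.
public class VirtualButtonLogTest : MonoBehaviour
{
    void Awake()
    {
        GetComponent<Button>().onClick.AddListener(() => Debug.Log("Virtual button clicked"));
    }
}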



Thank you very much. I am on a different project this week, but I will be back on the Unity project next week. I am keen to solve this, and I suspect something is wrong between the old and new input systems in Unity. The project actually has both enabled, and I am not sure whether binding the key “b” with the new system's on-screen button script actually triggers the key “b” as read by the old system.
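When I am back on it, my plan is to log the press on both sides to see which backend the on-screen button actually reaches, roughly like this (just a sketch, not my real project code):

using UnityEngine;
using UnityEngine.InputSystem;

// Logs the "b" press as seen by the new Input System and by the old
// Input Manager. If my suspicion is right, the on-screen button should
// only show up in the first log, while the physical keyboard shows up in both.
public class InputBackendCheck : MonoBehaviour
{
    InputAction bAction;

    void OnEnable()
    {
        // "<Keyboard>/b" matches the "b" key on any keyboard device,
        // including the virtual one the OnScreenButton feeds.
        bAction = new InputAction(binding: "<Keyboard>/b");
        bAction.performed += _ => Debug.Log("'b' seen by the NEW Input System");
        bAction.Enable();
    }

    void OnDisable()
    {
        bAction.Disable();
    }

    void Update()
    {
        // Old Input Manager path: "Jump" is the axis my project maps to "b".
        if (Input.GetButtonDown("Jump"))
        {
            Debug.Log("Jump ('b') seen by the OLD Input Manager");
        }
    }
}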