Pre-purchase Questions

Hi, I have not yet purchased any glasses as I am still researching, but I am looking to port an app from the Meta Quest 3. Which pair of glasses is best? It is a full mixed reality experience. I was originally thinking of getting the latest Air 2 Ultra, but I would need to capture first-person experiences in colour, and I thought I saw somewhere that it is only black and white. Should I get the Air 2 Pro then?

I was also wondering if it's possible to develop in Unity on a PC and test without an Android device. I understand the best experience is to have the device in your pocket, but for now I'm just looking to test the device in a Windows dev environment with Unity. Is this possible, or does the app need to be deployed to a phone to test?

Hi, by the first-person experience you mentioned, do you mean casting the images in the glasses to a PC? If so, the Ultra can also meet that requirement.
We do have an emulator in the SDK, but not all functions can be tested with it, so we recommend that developers deploy the app to a compatible Android phone for testing.
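For example, a common Unity pattern (standard Unity only, nothing SDK-specific) is to gate device-only code behind an editor check, so the same scene can be previewed in the emulator on the PC and fully exercised on the phone. A minimal sketch, where `StartDeviceOnlyFeatures()` is just a hypothetical placeholder:

```csharp
using UnityEngine;

// Minimal sketch (standard Unity only, not NRSDK-specific): stub out
// device-only functionality when running in the Editor/emulator so the
// same scene can be previewed on the PC and fully exercised on a phone.
public class DeviceOnlyFeatureGate : MonoBehaviour
{
    void Start()
    {
#if UNITY_EDITOR
        // Editor/emulator run: the emulator simulates the scene on the
        // PC screen, but hardware-dependent features are unavailable.
        Debug.Log("Editor/emulator run: skipping device-only features.");
#else
        // Deployed Android build with the glasses attached.
        StartDeviceOnlyFeatures();
#endif
    }

    // Hypothetical placeholder for anything that needs the physical glasses
    // (camera access, streaming setup, etc.).
    void StartDeviceOnlyFeatures()
    {
        Debug.Log("Device run: enabling hardware-dependent features.");
    }
}
```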

Thanks for your reply. There were really two questions, and I think you answered both, but I just want to be sure.

I want to develop and test with the glasses plugged into a PC, without needing an Android device. Is that possible? I realise it's best on Android for portability, but I initially want to test directly on the glasses with Windows and Unity. I don't have an Android phone (yet); I want to test the glasses / hardware on the PC. Does the emulator work directly with the glasses, or does it just show on the PC monitor?

The second part is that I want to be able to show users what the experience looks like (in colour). Am I able to capture the experience from the Ultra glasses? Is there a colour camera on the Ultra?

Hi. Regarding your first question, the Unity emulator I mentioned is used to simulate the scene on the PC screen without connecting the glasses to the PC directly.
The experience you mentioned is similar to the first person view function in our SDK, which allows you to stream what you see in the glasses to your PC so that others can view it on the PC screen. You can find more information about this function in our SDK documentation: First Person View | v2.2.0 | NRSDK.

Okay, so it seems like I can't test on the glasses without an Android device, which doesn't really work in my case. There's not much point getting the glasses until I have an Android device, then. I wanted to test first before investing in an Android device.

How about capturing in colour from the Air 2 Ultra? Is that possible?

Please confirm whether you want to capture both virtual and physical objects through the cameras on the glasses, or only want to show the virtual content in the glasses to others. The Ultra is equipped with two CV cameras, which are not accessible for capturing videos or pictures. However, if you only want to showcase the content in the glasses to others, the first person view function in our SDK can fulfil this requirement. It duplicates the content in the glasses (that is, whatever you configure in your Unity project, whether colour or black and white) and transmits it to a PC over the LAN.
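If colour stills of the rendered (virtual-only) content are enough for sharing, one possible workaround, sketched below with standard Unity APIs only (this is not an NRSDK feature), is to save screenshots of what the project renders. Note that this still will not include the real world, since the CV cameras are not exposed to apps:

```csharp
using UnityEngine;

// Sketch using standard Unity APIs only (not an NRSDK feature): save a colour
// screenshot of the rendered virtual content. The real-world view is NOT
// included, because the CV cameras on the glasses are not accessible to apps.
public class VirtualContentScreenshot : MonoBehaviour
{
    void Update()
    {
        // Hypothetical trigger; replace with whatever input your app uses.
        if (Input.GetKeyDown(KeyCode.S))
        {
            // On Android the file is written under Application.persistentDataPath.
            string fileName = $"capture_{System.DateTime.Now:yyyyMMdd_HHmmss}.png";
            ScreenCapture.CaptureScreenshot(fileName);
            Debug.Log("Saved virtual-content screenshot: " + fileName);
        }
    }
}
```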

I need to be able to capture the entire experience so that I can show it around on social media and in-store. It is very difficult to promote without this feature, and it is something I am able to do on both the Quest and the AVP.

Unfortunately, this does not make it a viable option for me. I think the platform has a lot of promise, but I will be waiting for these features before developing on the glasses:

  • Testing directly on the glasses in the Unity editor
  • Support on iPhone for full mixed reality experiences
  • Capture of the entire experience (content and real world in colour)

I don't mean to complain; I'm just providing feedback in the hope of influencing your roadmap. Once these features are there, I will certainly be back to develop on your platform.

Thanks for your help!

Thanks for your suggestions. We will try to improve the products and SDK.