These are the things I wish I knew before purchasing Nreal. The Nreal team could also use these as key areas to improve their service and products. This is my list:
1 - Windows support for using AR features on Nreal Light Glasses.
2 - Cable adapter for charging phone while using Nreal light glasses.
3 - Better customer support and dev support.
4 - Active Nreal Community feedback
5 - Cast any 2D app into MR Space on all other phones that can run Nebula with the Nreal Light glasses.
If anyone has anything else to add, feel free to reply in this topic.
Agree, there are many things lacking… I can say I was one of the first to have the Nreal glasses in the US, imported from Korea, and I have not really seen any change in the business model. For developers to get interested there has to be a user base, and for users to get interested there need to be apps.
A workaround is to allow the use of 2D apps for all of us… Since the US launch there has been no movement in explaining how any user, even Verizon users, can use their Android apps in MR Space.
The order of my list is:
Cast 2D apps
For Nreal to wash their hands and say that this is the telcos’ fault or an Android limitation is insulting.
I just want better world reconstruction and plane/depth detection. Sure, hand tracking is neat and useful, considering the limitations of the 3-DOF phone as a controller, but I’d rather they put those efforts into understanding the world the user is looking at first. That’s the biggest limitation as an AR developer, and one of the main things that enables true augmented reality.
I could live with phone-laser controls if the laser could detect whether it was hitting an obstruction and have a rough idea of how far away that was.
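The idea above, a controller ray that reports the distance to whatever surface it hits, boils down to a standard ray–plane intersection once the SDK can give you detected planes. This is just an illustrative sketch in plain Python (not NRSDK code; the function name and the plane values are made up for the example):

```python
def ray_hit_distance(origin, direction, plane_point, plane_normal):
    """Distance along a ray (unit-length direction) to a plane,
    or None if the ray is parallel to it or the hit is behind the origin."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(plane_normal, direction)
    if abs(denom) < 1e-6:
        return None  # ray runs parallel to the plane: no hit
    diff = tuple(p - o for p, o in zip(plane_point, origin))
    t = dot(plane_normal, diff) / denom
    return t if t >= 0 else None  # negative t means the plane is behind us

# Hypothetical detected wall 2 m in front of the user, facing back at them:
d = ray_hit_distance((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                     (0.0, 0.0, 2.0), (0.0, 0.0, -1.0))
print(d)  # 2.0
```

With decent plane detection, the same math would let the laser cursor stop at the first surface instead of extending to infinity.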
The Nreal glasses do not work when connected to a charging cable and a data cable at the same time. I’ve tried it and it doesn’t work.
The native 2D apps only work with the Oppo Find X3 Pro 5G; on other phones you have to use the web versions of Instagram, Facebook, Twitter…
Thank you for your comments. Our team is dedicated to building a better Nreal community. Feel free to communicate with us!
We really need better room awareness.
I bought a few cheap old Magic Leaps recently, and they provide the excellent tracking and spatial awareness that is crucial for most development on any AR platform.
I really would like to see something on Nreal (or other birdbath glasses) that comes close to this. Will we have to wait for the new chipset to get better SLAM, or will the current glasses get a usable update as well?
Agreed! Magic Leap includes a dedicated depth sensor: similar to the Microsoft Kinect, it projects a pattern of IR dots around the room and uses an IR camera to detect them. Since Nreal doesn’t have that, you won’t get room awareness that good without a new hardware design (and considering their newest glasses don’t even support 6-DOF motion tracking, I’m not optimistic about getting better AR glasses from Nreal anytime soon).
Now, other devices have been able to get pretty good realtime depth sensing with just a couple of RGB cameras, so there is still hope that a good software algorithm could improve Nreal’s room awareness, but I’m sure it’s not an easy solution. The best one I’ve used is made by a company (Stereolabs) with a history of creating depth maps from a single camera (they used to convert 2D Hollywood movies to 3D), so they’ve had lots of time to perfect their algorithm.
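For context on why two RGB cameras are enough in principle: once a stereo matcher finds the pixel disparity between the left and right views, depth follows from the standard pinhole relation depth = focal_length × baseline / disparity. A minimal sketch (the numbers are made-up example values, not specs of any Stereolabs or Nreal camera):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo model: depth in metres from pixel disparity,
    focal length in pixels, and camera baseline in metres."""
    if disparity_px <= 0:
        return float("inf")  # no measurable disparity: effectively at infinity
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700 px focal length, 12 cm baseline, 28 px disparity
print(depth_from_disparity(28.0, 700.0, 0.12))  # 3.0 (metres)
```

The hard part, of course, is the stereo matching itself (finding reliable per-pixel disparities in real time), which is where the years of algorithm work come in.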
Yes, I’ve wanted to buy one of the ZEDs for a while now, but they always manage to confuse me with their versions and generations carrying the same price tag.
I would expect Qualcomm to deliver an excellent experience. They’ve got the manpower, the engineering mindset, and seemingly a business model to pay for all that.
My iPhone has done this well for quite a while now (with some lag, and probably lots of bright people working on it for years), even without LiDAR, and the Magic Leap’s sensor has limited reach and resolution and seems unused most of the time (I might be wrong here).
I just wonder if my current Nreal will just be a pair of video goggles application-wise once the new platform is out. An alternative would be a low-level SDK that would let us run existing solutions on the sensor and control-device tree.
So more communication from Nreal would be welcome, despite their desire to own and control everything business-wise. A bit of cultural discussion, too, about what they might get back from the community in return for giving up some of that tight control over their underlying secrets.
All that under the assumption that Magic Leap (the company) won’t be available for a low price in the near future.
More options to control the display: headset-only or dual, and locking the screen angle/position without depending on the orientation of the phone.
A generally improved tracking system. Image, position, and hand tracking work, but they are really lacking compared to, say, the HoloLens.
It feels as if they have a lot of features but don’t really finish any of them. I love the device, but the lack of accuracy across the whole system keeps me from actively developing for and using it.
Also, being part of a good ecosystem matters. I hope they will properly support Snapdragon Spaces, so we can get a solid SDK that can be easily used by all future Android headsets on the market.