Using the host OS for screen-perspective calculation is highly taxing on the CPU/GPU and battery.
This is simply not the way to go for productivity scenarios.
Nebula for macOS is a perfect example of this, as can be seen from the 2-minute mark onwards in the following video (English subtitles at the bottom): Multi-screen AR space based on mac OS：Nreal Nebula for mac - YouTube
The Lenovo T1 Glasses are an example of OS-independent 2D screen mirroring for productivity.
This is achieved by the glasses zooming and shifting the screen internally.
Trapezoid or curving transforms are not necessary in this case.
In other words, the glasses have enough processing power to handle screen zooming and shifting on their own.
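The shift described above amounts to a small-angle pan: the mirrored frame moves opposite to head rotation so it appears pinned in space rather than glued to your view. A minimal sketch of that mapping, assuming a simple pinhole model (the function name and the focal-length value are illustrative, not Nreal or Lenovo internals):

```python
import math

def pixel_shift(yaw_rad, pitch_rad, focal_px):
    """Map head rotation (IMU yaw/pitch, radians) to a 2D framebuffer
    offset that moves the image opposite to the head, so the mirrored
    screen appears to stay put in space (pinhole model, small angles)."""
    dx = -focal_px * math.tan(yaw_rad)    # look right -> image shifts left
    dy = focal_px * math.tan(pitch_rad)   # look up    -> image shifts down
    return dx, dy

# Example: assumed 1600 px focal length, 2 degrees of yaw to the right
dx, dy = pixel_shift(math.radians(2.0), 0.0, 1600.0)
```

This is exactly the kind of per-frame arithmetic a small internal processor can do without touching the host.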
What is preventing Nreal from implementing this?
A difference in SDK is not a valid reason.
I personally have no interest in having any of my devices run apps to simply mirror Excel spreadsheets.
It is essential that this be resolved through OS-independent pathways.
Nreal glasses don’t have processing power. They don’t have a built-in CPU; that’s why they’re smaller.
I can’t seem to find a store page for it? Is it available for purchase?
The Lenovo T1 is just opening up for pre-order in China: Lenovo YOGA Smart Glasses T1 Released at 1,999 yuan, $285 (igeekphone.com). I don’t have the Chinese source.
Some foundation in electronics is necessary to understand that a “CPU” or a “GPU” is not how this is done, for a number of reasons, mainly cost, heat, volume, and weight.
Standalone VR headsets are an edge case: they stream video over a network connection created with a USB-C cable. This is more of a hack than a proper way to do it with a real video signal. The ability to fit a much larger heatsink into a bulky VR headset is what makes this hack possible.
The way I see it, there should be 3 modes:
- OpenXR mode for identification as an HMD - this should be available by the end of the year, I believe.
- 3D Productivity Mode that hogs battery and CPU/GPU processing power like there’s no tomorrow - this is how Nebula works today, and it’s fine for it to stay that way as a separate use case.
- 2D Productivity Mode that is done entirely by the internal DSP, independent of the device or OS the glasses are plugged into - this is how I want to use the glasses for work.
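For a 2D mode like this, zooming the frame down is what creates the room the shift can use: render the mirrored screen at, say, 75% of the panel and pan it within the leftover border. A rough sketch of the clamping logic, assuming made-up panel and zoom numbers (not T1 or Nreal specs):

```python
def clamp_shift(dx, dy, panel_w, panel_h, zoom):
    """Keep a zoomed-down image inside the physical panel.
    The usable pan margin on each axis is half of the unused border."""
    max_dx = panel_w * (1.0 - zoom) / 2.0
    max_dy = panel_h * (1.0 - zoom) / 2.0
    dx = max(-max_dx, min(max_dx, dx))
    dy = max(-max_dy, min(max_dy, dy))
    return dx, dy

# Example: 1920x1080 panel, image rendered at 75% -> +/-240 px horizontal margin
clamped = clamp_shift(400.0, 0.0, 1920, 1080, 0.75)
```

The trade-off is visible here: more zoom gives more stabilization headroom but a smaller virtual screen.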
Could Nreal please comment on what is preventing implementation of this DSP powered 2D Productivity Mode?
Hi, the Mac Nebula runs in 3D productivity mode, so it relies on the host OS and CPU/GPU, which are suitable for the IMU computation. Theoretically, a DSP-powered mode should be enough for the computing, but I am not sure whether that would put too much stress on the glasses themselves and cause them to break.
The next-gen Qualcomm XR chips to be announced next month may be able to take care of this without much heat.
Responsiveness does not need to be stellar, just enough that the image follows head movement and stays pinned in place without jitter.
Would you guys be able to run some tests on current gen DSPs please?
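That level of "good enough" responsiveness is usually achieved with a cheap smoothing filter on the IMU angles: an exponential moving average plus a small deadband kills jitter at the cost of a little lag, which is fine for a static spreadsheet. A hedged sketch (the alpha and deadband values are guesses, not measured tunings):

```python
def make_smoother(alpha=0.2, deadband=0.001):
    """Return a stateful filter: exponential moving average plus a small
    deadband so sub-threshold IMU noise doesn't move the image at all."""
    state = {"y": None}
    def step(x):
        if state["y"] is None:
            state["y"] = x            # initialize on the first sample
        elif abs(x - state["y"]) > deadband:
            state["y"] += alpha * (x - state["y"])  # ease toward the reading
        return state["y"]
    return step

smooth = make_smoother()
readings = [0.0, 0.0005, -0.0004, 0.05, 0.05, 0.05]  # yaw samples in radians
filtered = [smooth(r) for r in readings]
```

The first three noisy samples fall inside the deadband and produce no motion at all, while the real 0.05 rad head turn is followed gradually.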