Floating Display feature of Lenovo T1 Glasses

Hi,

I have a question about the recent release of Lenovo T1 Glasses.
I was surprised to discover that these glasses provide a floating display feature whilst being detected as a simple external monitor on Windows/Mac.
In other words, the display changes position based on 3DoF head movement without using any GPU/CPU power on the PC.
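To pin down what I mean by that, here is my own hedged sketch of the principle (not Lenovo’s code; the FoV figures, panel size, and sign conventions are my assumptions):

```c
/* Hedged sketch of 3DoF screen anchoring: map yaw/pitch from the IMU
 * (radians, relative to the orientation at lock time) to a pixel offset,
 * so the image appears fixed in space as the head turns. FoV numbers,
 * panel size, and signs are assumptions, not Lenovo's actual values. */

#define FOV_H_RAD 0.785f   /* assumed ~45 deg horizontal field of view */
#define FOV_V_RAD 0.436f   /* assumed ~25 deg vertical field of view   */
#define PANEL_W   1920
#define PANEL_H   1080

typedef struct { int x, y; } Offset;

Offset screen_offset(float yaw, float pitch) {
    Offset o;
    /* Small-angle approximation: shift the image opposite to head motion.
     * Exact signs depend on the IMU's axis conventions. */
    o.x = (int)(-yaw   / FOV_H_RAD * PANEL_W);
    o.y = (int)( pitch / FOV_V_RAD * PANEL_H);
    return o;
}
```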

As per the news article:
“On a Windows laptop, it made slightly more sense. Windows detects the Glasses T1 as a second monitor, and you can use a button on the Glasses to “lock” the screen in one location so you can turn your head away from it.”

This feature is great news and is precisely what I am looking for when it comes to 2D productivity on a PC.

Since the PC’s GPU/CPU are not used, is this implemented solely on the DSP onboard the glasses?
When will this feature be available on Nreal Light, Nreal Air, or both?

Many thanks


Well, there’s news that Nebula will be ported to macOS in September, which will allow up to 3 floating screens, but no news as of yet for Windows :kissing:


Thanks, Mr.FarePlay

I have seen Nreal comment that OpenXR will be supported by the end of the year.
My guess is that this is when virtual displays will become available on Windows.
OpenXR generally means the glasses are identified by the OS as an HMD (not as a standard monitor).
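For anyone curious what that distinction looks like from an application’s point of view, here is a minimal, generic OpenXR probe in C (it needs an OpenXR loader/runtime installed and is not Nreal-specific; it simply asks the runtime for a system with the HMD form factor):

```c
#include <openxr/openxr.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    /* Create a bare OpenXR instance (no extensions requested). */
    XrInstanceCreateInfo ici = { XR_TYPE_INSTANCE_CREATE_INFO };
    strcpy(ici.applicationInfo.applicationName, "hmd-probe");
    ici.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;
    XrInstance instance;
    if (XR_FAILED(xrCreateInstance(&ici, &instance))) return 1;

    /* Ask the runtime for a head-mounted-display system: this is the
     * point where the OS treats the glasses as an HMD, not a monitor. */
    XrSystemGetInfo sgi = { XR_TYPE_SYSTEM_GET_INFO };
    sgi.formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY;
    XrSystemId system;
    XrResult r = xrGetSystem(instance, &sgi, &system);
    printf(XR_SUCCEEDED(r) ? "HMD runtime found\n" : "no HMD available\n");

    xrDestroyInstance(instance);
    return 0;
}
```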

However, what the Lenovo T1 is doing is distinctly different.
There is no app like Nebula. The glasses are identified as a monitor (not as an HMD). Only one floating display is shown, and all head-lock/head-unlock calculations are done by the glasses alone, independently of the PC.
And this is exactly what I would love to see on Nreal, i.e. a head-unlocked Windows screen without any apps running on the PC.
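In pseudocode terms, the button behaviour the article describes could be as simple as this (assumed behaviour on my part, not Lenovo firmware; it reuses screen_offset() from my sketch above):

```c
/* Assumed behaviour of the lock button, not Lenovo firmware: pressing it
 * captures the current orientation as a reference; while locked, offsets
 * against that reference keep the screen anchored in space. */

typedef struct { float yaw, pitch; } Pose;

static Pose reference;   /* orientation captured at lock time */
static int  locked = 0;

void on_lock_button(Pose current) {
    locked = !locked;
    if (locked) reference = current;   /* freeze the anchor in space */
}

Pose head_offset(Pose current) {
    Pose off = { 0.0f, 0.0f };
    if (locked) {   /* unlocked: offset stays zero, screen follows the head */
        off.yaw   = current.yaw   - reference.yaw;
        off.pitch = current.pitch - reference.pitch;
    }
    return off;     /* feed into screen_offset(off.yaw, off.pitch) */
}
```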

Could @XREAL-dev please confirm if this is coming to Nreal Air or Light or both?


Oh, that sounds like the glasses have their own CPU, like the ThinkReality A3, if they can do all that independently of the device they are connected to.

If that’s the case, I don’t believe Nreal can do that, seeing that their glasses don’t have a built-in CPU. At least not without an app like Nebula :thinking:

It’s not the CPU that does this on the glasses but the DSP.
All Qualcomm-based glasses have a DSP built in.

The DSP is involved in hand-tracking computation.
Where the glasses have no cameras and therefore no hand tracking (Nreal Air or Lenovo T1), the DSP is free to do things like screen perspective calculations; see the sketch below.
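To be concrete about what a “screen perspective calculation” could mean here (my own illustration, not Nreal’s or Lenovo’s code): warping the flat desktop image by a 3x3 homography derived from head pose is plain 2D array work of the kind DSPs are built for.

```c
#include <stdint.h>

/* Illustrative 2D projective warp (inverse mapping): H is a 3x3 homography,
 * row-major, taking OUTPUT pixel coordinates to SOURCE pixel coordinates.
 * One 8-bit channel for brevity. Not actual Nreal/Lenovo code. */
void warp_homography(const uint8_t *src, uint8_t *dst,
                     int w, int h, const float H[9]) {
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            float d  = H[6] * x + H[7] * y + H[8];
            int   sx = (int)((H[0] * x + H[1] * y + H[2]) / d);
            int   sy = (int)((H[3] * x + H[4] * y + H[5]) / d);
            dst[y * w + x] =
                (sx >= 0 && sx < w && sy >= 0 && sy < h)
                    ? src[sy * w + sx]   /* nearest-neighbour sample */
                    : 0;                 /* outside the screen: black */
        }
    }
}
```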

I do believe this is possible on the Nreal Air.
It should equally be possible on the Nreal Light; however, the firmware may need to be re-flashed, and hand tracking may no longer work.
Ideally, I would like the freedom to switch between hand-tracking and floating-screen firmware at will.

Could @XREAL-dev please let us know if a DSP-based floating screen feature is coming to Nreal?

Just like you, I’m looking for glasses that work for work (video calls, remote desktop, barcode, AR, MR). I have the Meta Quest 2, the Nreal Light, and the Vuzix Blade, and I’m waiting for Nreal to work with these types of work applications; if not, I’ll keep looking.

Hi, I think the SDK frameworks of the Lenovo glasses and the Nreal glasses are different, so the solutions are different. And we currently have no plan to fix the virtual screen in place using the glasses alone rather than an app.

Thanks @XREAL-dev

The bottom line for anybody doing productivity work is that they need to preserve battery for as long as possible.

An app-based solution would not satisfy that, as it would drain the battery as quickly as if you were playing a 3D game whilst using Excel.

What would need to happen for your plans to change?

@nearreal

You do realise that regardless of where you put the head-tracking processing (on the glasses or on the PC), the power cost of that processing will still be there?

Not to mention that the Air doesn’t really have a comparable DSP. It’s basically “just” a USB hub and a DisplayPort-to-MIPI adapter (to convert the incoming signal for the actual displays); pretty much everything else (accelerometer data, speakers) is exposed through said USB hub. It’s very far from the processing power of the Snapdragon XR series.
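To illustrate the “everything through the USB hub” point: on the host side the sensors show up as ordinary USB HID devices. A hedged sketch using the hidapi library; the vendor/product IDs and the report layout here are placeholders, not the Air’s real descriptors:

```c
#include <hidapi/hidapi.h>
#include <stdint.h>
#include <stdio.h>

#define VID 0x0000   /* placeholder vendor id, not the Air's real VID  */
#define PID 0x0000   /* placeholder product id, not the Air's real PID */

int main(void) {
    if (hid_init()) return 1;
    hid_device *dev = hid_open(VID, PID, NULL);
    if (!dev) { hid_exit(); return 1; }

    unsigned char buf[64];
    int n = hid_read(dev, buf, sizeof buf);   /* one raw sensor report */
    if (n >= 6) {
        /* Hypothetical layout: three little-endian int16 accel samples. */
        int16_t ax = (int16_t)(buf[0] | buf[1] << 8);
        int16_t ay = (int16_t)(buf[2] | buf[3] << 8);
        int16_t az = (int16_t)(buf[4] | buf[5] << 8);
        printf("accel: %d %d %d\n", ax, ay, az);
    }
    hid_close(dev);
    hid_exit();
    return 0;
}
```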

Also, the 3D calculations and transformations of the windows would not be done on the DSP. A DSP is great for 2D array manipulation, but for 3D work you need a GPU.

Hi @fonix232

As I understand it, the DSPs that come with Qualcomm XR chips are used for hand tracking, hence I wouldn’t rule out the possibility of 2D image manipulation; I am referring to the 2D productivity use case, which does not involve true 3D.
The DSP could also simply drive the MIPI display controller, shifting and zooming the screen based on tracking-sensor data without applying any perspective to it. I would be perfectly happy with that for 2D productivity. I wonder if the Lenovo T1 actually does just that? A sketch of what I mean follows below.
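If the controller scans out from a framebuffer larger than the panel, “shifting” the image is just one register write per frame, with no per-pixel work at all. All register names and the buffer layout here are hypothetical, for illustration only:

```c
#include <stdint.h>

#define STRIDE_BYTES  (3840 * 4)  /* framebuffer wider than the panel */
#define BYTES_PER_PX  4

extern void write_reg(uint32_t reg, uint32_t value); /* hypothetical MMIO */
#define DC_FB_START   0x1000u     /* hypothetical controller register    */

/* Pan-by-start-address sketch: move the scan-out origin inside an
 * oversized framebuffer. dx/dy are assumed non-negative offsets. */
void pan_display(uint32_t fb_base, int dx, int dy) {
    uint32_t start = fb_base
                   + (uint32_t)dy * STRIDE_BYTES
                   + (uint32_t)dx * BYTES_PER_PX;
    write_reg(DC_FB_START, start);   /* takes effect at the next vsync */
}
```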

Moving on to power: even if we do resort to a small ARM-based GPU embedded in the glasses, the power consumption would be orders of magnitude lower than doing the same thing on a laptop or desktop GPU.
And that is before mentioning the laptop performance resources used up unnecessarily just to display a 2D image of a spreadsheet.
Using laptop/desktop GPUs for 2D productivity is just a big no.
OpenXR is a different story.

Could @XREAL-dev please confirm which chip the Air is based on? Is it a Qualcomm XR1? Does it have a built-in DSP or a GPU?