I checked the Xreal homepage regularly for the Nebula release: a "coming soon" that lasts until the return window for the headset (sold as a package that includes the supporting software) expires is a "not soon enough".
I only stumbled across this release pretty much by accident: not OK.
I had high hopes, but I've requested a return after testing this software with five of the machines in my home lab: I don't expect it to reach satisfactory performance any time soon.
Satisfactory in my book would mean better than what the Beam box can offer, with hardware that pretty much by definition must be less powerful than "current" iGPUs or dGPUs.
Yes, modern mobile SoCs from Qualcomm may actually beat Intel integrated HD graphics up to Ice Lake. But a 96EU Xe iGPU with 5 watts of dedicated power should at least equal the Beam, or offer a smooth single virtual screen at 72Hz: that is roughly the power budget a mobile SoC grants its iGPU part alone, and even Intel isn't that bad. AMD's GCN5 in Cezanne is pretty nearly in the same league, and 3DMark seems to bear me out: these iGPUs should be good enough for the job; you shouldn't need an M1 Ultra or a >100 watt dGPU.
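To put a rough number on that claim, here is a back-of-envelope sketch with my own assumed figures (not Xreal's specs): the Air² shows roughly 1920×1080 per eye, and warping a flat desktop texture onto a virtual screen costs on the order of one texture read and one write per output pixel.

```cpp
// Back-of-envelope: compositing cost of one virtual 72Hz screen.
// Assumed numbers: 1920x1080 per eye, ~8 bytes of memory traffic
// per output pixel for a simple texture warp. My estimates only.
#include <cstdio>

int main() {
    const double width = 1920, height = 1080;
    const double eyes = 2, hz = 72;
    const double pixels_per_sec = width * height * eyes * hz;
    const double bytes_per_sec  = pixels_per_sec * 8;  // 4B read + 4B write
    printf("pixels/s:  %.2e\n", pixels_per_sec);           // ~3.0e8
    printf("bandwidth: %.1f GB/s\n", bytes_per_sec / 1e9); // ~2.4 GB/s
    // A 96EU Xe iGPU is typically quoted at tens of Gpixels/s fill rate
    // and >50 GB/s memory bandwidth, so a single warped screen should
    // consume only a few percent of what the chip can do.
    return 0;
}
```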
Since the Air²/Nebula require both a USB-C port with DP Alt Mode and a powerful dGPU, preferably from Nvidia, I could not test with my most powerful systems, because those dropped VirtualLink support and only offer DP and HDMI ports. AFAIK a cable alone simply can't join USB and DP into a USB-C/Alt-DP signal, so I've ordered a Thunderbolt PCIe extension card that I'll combine with an RTX 4090 on a Ryzen 9 5950X for the most powerful base platform I can use for testing. It's an extra €80 to prove a point, because I don't think hardware performance is the real issue, judging by virtual desktops elsewhere, or in fact by the Beam.
I had bought an Air² Pro with Beam bundle and was limited to the following systems, mostly on Windows 11, some on Windows 10, all with the latest Nvidia drivers (CUDA 12.3.2), OS updates etc.:
- RTX 2080 Ti (in a Broadwell 22-core Xeon), which has a VirtualLink USB-C port and quite a bit of graphics power, 60% more than the mobile RTX 3060 in the laptop the Xreal developers seem to use
- Lenovo laptop with Ryzen 7 5800U and GCN5 iGPU, two USB-C ports with Alt-DP
- Asus laptop with Alder Lake i7-12700H, 96EU Xe iGPU, one USB-C/TB/Alt-DP port
- Intel Enthusiast NUC11 Phantom Canyon, Tiger Lake i7-1165G7 + 96EU Xe iGPU + mobile RTX 2060 dGPU, dual USB-C/TB3 connected to the Xe (hybrid mode operation required), dGPU connected to DP and HDMI ports
- Intel Enthusiast NUC12 Serpent Canyon, Alder Lake i7-12700H + 96EU Xe iGPU + Arc A770M dGPU, iGPU connected to USB-C/TB4 (hybrid mode operation required), dGPU connected to DP and HDMI ports
The first thing the beta Nebula software does is overwrite the Air² firmware. If you then connect the glasses to the Beam, the Beam overwrites the Air's firmware with its own, older variant. And so it goes on, which is a bit scary, since firmware updates are often a high-risk operation and I simply don't know if and when the two will agree on a single version…
Most of my systems have been used with a plethora of monitors before; the RTX 2080 Ti runs two 4K screens, one at 144Hz, and occasionally an Oculus Rift CV1 or a wireless Lenovo Android VR headset. In other words, Windows remembers plenty of monitor configurations for each of these systems, and that can be a problem for Nebula, which creates extra screens both for the logical screen content and for the simulated monitors: Windows can get confused about where these are logically located and which of them are actually active. Switching from 72 to 90Hz seems to create new layouts, causing additional challenges. I had to make sure that all screens were used in extension mode, and sometimes the time you get to make these changes was too short: Windows would reset them before I could confirm their validity… (The sketch below shows one way to inspect what Windows actually thinks is attached.)
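A minimal diagnostic sketch using the classic Win32 enumeration APIs (my own debugging idea, nothing Nebula ships): it lists every display device Windows knows about, whether it is active, and the mode and desktop position it is currently driven at, which helps spot the phantom monitors and layout resets described above.

```cpp
// List all display devices, their state, and their current mode.
// Build: cl /DUNICODE sketch.cpp user32.lib
#include <windows.h>
#include <cstdio>

int main() {
    DISPLAY_DEVICEW dd;
    for (DWORD i = 0; ; ++i) {
        ZeroMemory(&dd, sizeof(dd));
        dd.cb = sizeof(dd);
        if (!EnumDisplayDevicesW(nullptr, i, &dd, 0))
            break;  // no more adapters/outputs
        const bool active = (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) != 0;
        wprintf(L"%lu: %ls (%ls) %ls\n", i, dd.DeviceName, dd.DeviceString,
                active ? L"active" : L"inactive");
        DEVMODEW dm = {};
        dm.dmSize = sizeof(dm);
        if (active && EnumDisplaySettingsW(dd.DeviceName, ENUM_CURRENT_SETTINGS, &dm))
            wprintf(L"   %lux%lu @ %luHz, position (%ld,%ld)\n",
                    dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency,
                    dm.dmPosition.x, dm.dmPosition.y);
    }
    return 0;
}
```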
Long story short: I started with the less powerful devices, first #3, the notebook with Xe graphics. When that device delivered around 10Hz refresh on head movements even with a single 72Hz virtual screen, I feared that more graphics power was required.
But after quite a bit of wrangling with the RTX 2080 Ti system, which doesn't want to use better than 60Hz refresh on any but the primary screen, and after testing all the other devices, I've come to the conclusion that the 10Hz refresh isn't related to a lack of GPU power. I used GPU-Z and HWinfo to observe CPU and GPU loads, and the Nebula app doesn't stress either of them; why the refresh rates are so abysmal I have no idea.
I validated that enabling a fixed vertical sync on the GPUs was required to stop terrible visual artifacts on the headset, but it didn't deliver anywhere near the smoothness the Beam can (which has its own challenges with the text on the virtual screens).
I was really disappointed to find that the NUCs with hybrid graphics wouldn't work at all, because they represent a rather large class of notebooks that are designed in a very similar manner.
Some years ago, notebook designs with optional dGPUs had to physically switch the output ports between the iGPU for low-power desktop work and the dGPU for gaming or graphics. Then GPU vendors developed a hybrid approach where the dGPU's output ports aren't physically connected at all: much like data-center GPGPUs, the dGPU only renders into its frame buffer. The iGPU then reads from that frame buffer, which is mapped into the SoC's virtual address space, and copies its content at screen refresh rate into its own frame buffer, which feeds the serial display outputs. That sounds computationally expensive and like "double work", but it turns out to be such a light workload that vendors felt it was worth saving the money and integration testing on the switch. The actual performance difference or lag is minor, but measurable, so higher-end gaming designs tend to also offer direct DP or HDMI ports.
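A quick sketch of why that cross-adapter copy is cheap, again with assumed example numbers rather than vendor specs:

```cpp
// Rough cost of the hybrid-graphics copy: the iGPU re-reads the dGPU's
// frame buffer once per refresh. Example numbers, worst case here: 4K144.
#include <cstdio>

int main() {
    const double w = 3840, h = 2160, bytes_per_px = 4, hz = 144;
    const double copy_gbs = w * h * bytes_per_px * hz / 1e9;
    printf("copy traffic: %.1f GB/s each way\n", copy_gbs);  // ~4.8 GB/s
    // Dual-channel DDR4-3200 alone moves ~50 GB/s, and the PCIe link
    // between the GPUs is also in the tens of GB/s, so the copy costs a
    // measurable but small slice of bandwidth plus roughly a frame of lag.
    return 0;
}
```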
In the case of USB-C/TB/Alt-DP ports this approach seems to save even more hardware validation and other overhead, which probably explains why most such universal USB-C/TB/DP ports only physically connect to the iGPU.
So even on some high-end notebooks and NUCs, the only port that matches the Air's requirements may not be physically connected to the big GPU. And in that case, Nebula currently does not work at all. The virtual screen stays dark, and the only sign of life is that when I use "Identify" in the Windows display settings, the monitor number shows up in the headset. AFAIK that's more of a software bug than an app design error, but without details I can't exclude the (worst case) possibility that Xreal have not done their research on what the Windows platform can support at the required fluidity.
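To check which GPU actually drives each output, here is a small diagnostic sketch using the Win32 display-configuration API (again my own debugging idea, nothing Xreal ships): it prints each active monitor together with the adapter device path that sources its pixels, so on the NUCs the glasses should show up bound to the Xe iGPU.

```cpp
// Map every active Windows display path to the GPU adapter driving it.
// Build: cl /DUNICODE paths.cpp user32.lib
#include <windows.h>
#include <cstdio>
#include <vector>

int main() {
    UINT32 nPaths = 0, nModes = 0;
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &nPaths, &nModes) != ERROR_SUCCESS)
        return 1;
    std::vector<DISPLAYCONFIG_PATH_INFO> paths(nPaths);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(nModes);
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &nPaths, paths.data(),
                           &nModes, modes.data(), nullptr) != ERROR_SUCCESS)
        return 1;
    for (const auto& p : paths) {
        // Monitor's friendly name: the glasses should appear here.
        DISPLAYCONFIG_TARGET_DEVICE_NAME target = {};
        target.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_TARGET_NAME;
        target.header.size = sizeof(target);
        target.header.adapterId = p.targetInfo.adapterId;
        target.header.id = p.targetInfo.id;
        DisplayConfigGetDeviceInfo(&target.header);
        // Adapter device path: reveals which GPU sources this output.
        DISPLAYCONFIG_ADAPTER_NAME adapter = {};
        adapter.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_ADAPTER_NAME;
        adapter.header.size = sizeof(adapter);
        adapter.header.adapterId = p.sourceInfo.adapterId;
        DisplayConfigGetDeviceInfo(&adapter.header);
        wprintf(L"%ls driven via %ls\n",
                target.monitorFriendlyDeviceName, adapter.adapterDevicePath);
    }
    return 0;
}
```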
The software is much more a Unity app than a driver. In theory that should provide device independence and work with any hardware.
In practice, the app seems to use terminal-services facilities to create screens and then RDP API calls to do what the hybrid graphics drivers do. It creates one or more "in-game" monitors, which the applications write to, then scrapes these "in-game" monitors via RDP shadowing APIs or similar, and projects and transforms them onto one to three flat virtual monitors, or two variants of bendable monitors, in a space you can view with the Air in "stereo mode".
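I can't see Xreal's code, so the exact mechanism is my assumption; but the standard user-mode way to scrape a monitor on Windows is the DXGI Desktop Duplication API, and a minimal sketch of it illustrates the class of machinery involved:

```cpp
// Minimal sketch of user-mode screen scraping via DXGI Desktop
// Duplication: one plausible mechanism, NOT Xreal's confirmed code.
#include <d3d11.h>
#include <dxgi1_2.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &context)))
        return 1;

    // Walk device -> adapter -> first output -> duplication interface.
    IDXGIDevice* dxgiDevice = nullptr;
    device->QueryInterface(__uuidof(IDXGIDevice), (void**)&dxgiDevice);
    IDXGIAdapter* adapter = nullptr;
    dxgiDevice->GetAdapter(&adapter);
    IDXGIOutput* output = nullptr;
    if (FAILED(adapter->EnumOutputs(0, &output)))
        return 1;  // fails when this adapter drives no display at all
    IDXGIOutput1* output1 = nullptr;
    output->QueryInterface(__uuidof(IDXGIOutput1), (void**)&output1);
    IDXGIOutputDuplication* dup = nullptr;
    if (FAILED(output1->DuplicateOutput(device, &dup)))
        return 1;

    // Capture loop: each acquired frame is a GPU texture that a
    // compositor could warp onto a virtual monitor before release.
    for (int frame = 0; frame < 72; ++frame) {
        DXGI_OUTDUPL_FRAME_INFO info;
        IDXGIResource* resource = nullptr;
        if (FAILED(dup->AcquireNextFrame(16, &info, &resource)))
            continue;  // timeout: nothing on screen changed
        ID3D11Texture2D* tex = nullptr;
        resource->QueryInterface(__uuidof(ID3D11Texture2D), (void**)&tex);
        // ... sample 'tex' in a shader, or CopyResource it elsewhere ...
        tex->Release();
        resource->Release();
        dup->ReleaseFrame();
    }
    return 0;
}
```

Note the EnumOutputs step: on a hybrid system the dGPU adapter typically reports no outputs at all, which would fit the dark virtual screen I saw on the NUCs.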
Compared to your typical game this is a relatively modest workload, but it involves quite a bit of OS shenanigans and overhead, in an environment that is rather unforgiving in terms of real-time demands. The hybrid drivers I mentioned above work in a privileged OS kernel context and do not require user-mode services or foreign APIs like RDP: it's a somewhat unfair competition.
Yet other VR headsets manage virtual desktops without being any more intelligent: I've had VR desktops, bent and shaped in many ways, operating quite easily on everything that runs Oculus or SteamVR, but only mirroring the single primary screen.
So far none of them has tried extra screens, and perhaps that is an area where Microsoft doesn't offer a fast path the VR desktops could use… I am pretty much wild-guessing here, as you might be able to tell.
But it seems to indicate that Xreal is simply not putting the resources into writing and testing the app that are required to get the job done. Of course that's difficult for a startup, but at the current rate of progress I'd see the Air² breaking from firmware reflashes or old age before they become usable, which is why I am trying to return mine after this first glimpse of the software… and quite a few hours of testing.