So I finally got the Beam today and did some testing.
It shows that the gyros, accelerometers and (potentially) compass in the headset are truly excellent: the image in loitering mode is very stable and won’t wander off, unlike what I’ve seen in early not-quite-VR Google Cardboard-inspired devices such as LeEco’s LeVR, which had a tendency to drift.
So the headset’s potential seems rather good, but so far it only seems to be usable by the fruity cult (Apple).
The Beam must have quite some processing power, as it does the transposing of the source image onto the virtual screen, which might explain why the device gets rather warm and doesn’t last very long. But I’m also guessing it uses an older-technology SoC, perhaps some 28nm Amlogic or Mediatek part. My phones should do much better, but unfortunately they don’t work with Nebula, while screencasting is quality-limited (see below).
Too bad it’s not quite capable enough to support 3D video, which only works on a directly wired connection without Beam or Nebula.
Miracast/Screencast video quality seems OK for THD movies (I haven’t yet tried what happens when the source is 4k), and perhaps a bit of light gaming, but not for work or reading text in a browser. There are far too many artifacts, and you really need to let the image settle for a second before you can read text. Officially the screen refreshes at 30Hz at 1920x1080, but even when I reduce the resolution to the point where the refresh reaches 60Hz, that doesn’t eliminate the artifacts or make text readable before it has settled. Do I need to mention that 30Hz scrolling on a 120Hz screen isn’t really what you’d expect?
In theory the ability to make the virtual screen bigger than the physical projection area sounds like a good idea, because I’m also moving my head to read on my 42" 4k displays. But in practice head movement doesn’t blur the characters on a physical monitor, while it does on the virtual screen inside the headset. I couldn’t work with that without getting something akin to VR sickness; not because of motion disorientation, but because of the blurred screen content.
When using the Beam with USB-C screen pass-through, the Miracast/Screencast bandwidth constraints are nearly eliminated and the video quality improves. But scanning across a page of text is almost as bad, with letters becoming hardly readable until you’ve stopped moving your head, completely defeating the advantage of the virtually bigger screen for work, at least with the Beam.
It’s especially frustrating because the movement is otherwise extremely smooth and the virtual screen is rock stable. It can’t really be the OLED screens either, since they should technically have near-zero lag, so the only explanation I have is that the Beam can’t keep up.
The Beam really limits what the display hardware can otherwise do (1920x1080@120Hz or 3840x1080@60Hz), simply because it lacks the bandwidth and processing power.
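To put some rough numbers on those bandwidth claims, here is a back-of-the-envelope calculation of the uncompressed bandwidth each of the display modes mentioned above would need, assuming 24-bit RGB color (an assumption on my part; the actual links carry compressed video, but the raw rates show why aggressive compression, and hence artifacts, are unavoidable over WLAN):

```python
# Rough uncompressed bandwidth for the display modes discussed above.
# Assumes 24 bits per pixel (24-bit RGB) - an illustrative assumption,
# not a statement about the Beam's actual internal pixel format.

def raw_gbps(width, height, hz, bits_per_pixel=24):
    """Uncompressed video bandwidth in gigabits per second."""
    return width * height * hz * bits_per_pixel / 1e9

modes = {
    "1920x1080 @  30 Hz (Miracast)": raw_gbps(1920, 1080, 30),
    "1920x1080 @ 120 Hz (native)  ": raw_gbps(1920, 1080, 120),
    "3840x1080 @  60 Hz (native)  ": raw_gbps(3840, 1080, 60),
}

for name, gbps in modes.items():
    print(f"{name}: {gbps:.2f} Gbit/s uncompressed")
```

Even the 30Hz Miracast mode needs roughly 1.5 Gbit/s uncompressed, well beyond realistic Wi-Fi throughput, while the native 120Hz and ultrawide modes need about four times that, which is consistent with the Beam falling back to heavily compressed, lower-refresh streams.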
If all you want to do is watch THD movies on those glasses (no 3D), it’s OK. But desktop work is marginal (cable) or hardly possible at all (WLAN). I’ve not tried games myself, but the technical constraints are the same, so the warnings about artifacts I’ve seen from YouTubers seem very plausible, especially with WLAN streaming.
While the Beam demonstrates the quality of the headset’s sensors and hints at just how fine Nebula might work once available for PCs, the hardware limitations of the Beam don’t replace the need for a PC variant of Nebula.
Unfortunately that isn’t there, and the fact that a final release has been missing for six months now isn’t exactly encouraging. The April beta refuses to recognize the headset, even when I set it up exactly as the error message and the Reddit post tell me to.
I’ve tried with the Intel Arc A770M of my Serpent Canyon NUC (which works just fine in the “hardware display” mode), as well as with an AMD Ryzen 7 5800U iGPU and an RTX 2080 Ti: the error message is always the same, while the non-Nebula mode works just fine but only gives a single static display.