Streaming from AR Glasses to Laptop

I am trying to stream a sample scene from my AR Light glasses to my laptop. I have looked into Unity's Render Streaming, but I have no idea why it is not streaming to the browser at 127.0.0.1. Can someone help?
This is the tutorial I generally followed: Creating The Scene | Unity Render Streaming | 3.1.0-exp.7

I used NRSDK's HelloMR scene and added some components to NRCameraRig, but I believe I may have missed something, as shown in the screenshot below:

I loaded my APK onto my glasses, and I expect to be able to run my AR glasses as a server broadcasting to my laptop, but I have no clue how.

Note: I am using the computing unit with NRSDK 1.9.5.

Hi,

Have you tried this demo?

Hello, I will try the demo and let you know if I run into any problems. Thank you!

I also wish to ask whether the default black background can be turned transparent. From what I can see, the demo projects in NRSDK all seem to have a black background when loaded onto the glasses. How can I change the background to a transparent one?

I’m a bit confused about the specifics of your question. If you are referring to actual usage with the glasses, the black background is used to allow you to see the real-world scene behind the virtual objects. In actual usage, the black background effectively appears transparent.

If you are referring to recording or streaming, you can see this in the RGBCameraRecord demo. When you select the “Blend” mode for recording, it combines the real-world scene with the virtual objects, effectively making the background transparent. If you select the “Virtual” mode, it will record with a black background.

Hello, I’m providing an example of what I wish to achieve in a screenshot below. I want to be able to see the real world layout while still seeing the assets.

Can the glasses achieve background clarity akin to an AR app on a phone?

Yes, but since the Light's RGB camera resolution is only 720p, the quality might be limited. You could try recording a video yourself to see the results, or check out this video for an idea of the output:


A few questions:

  1. Are you using the Experimental version of the SDK? This has the streaming feature in it. If you’re not, find the experimental version of the SDK you’re using and use it instead. You need the streaming prefab from there.

  2. Are you using the XReal Streaming Receiver on your laptop in order to record the view from your glasses?

  3. Are the glasses and the laptop on the same Wi-Fi network?

If any of that confuses you, check out this page in their documentation: https://xreal.gitbook.io/nrsdk/development/tools/first-person-view

With regards to the black background: black is transparent. It's the same as having a black background in Photoshop/After Effects and making it an additive layer. The black won't show in the glasses, or when streaming from the glasses to your laptop.
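To illustrate why black reads as transparent on an additive display, here's a minimal sketch (plain Python, hypothetical pixel values, not NRSDK code): the glasses add the virtual image's light on top of the real-world light passing through the lens, so a black virtual pixel (0) contributes nothing and the real world shows through.

```python
def additive_blend(real: int, virtual: int) -> int:
    """Additively composite one 8-bit channel: the display adds the
    virtual pixel's light on top of the real-world light, clamped
    to the channel's maximum of 255."""
    return min(real + virtual, 255)

# A black virtual pixel adds no light -> the real world shows through:
print(additive_blend(120, 0))    # → 120 (real-world value unchanged)

# A bright virtual pixel dominates, saturating at full intensity:
print(additive_blend(120, 200))  # → 255
```

This is why a solid-black Unity background effectively disappears in the glasses: every black pixel adds zero light.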


Hello, I am looking into the link you provided, thank you. I noticed that there are some missing images from the tutorial, are those important? @Dorix

Also, to answer the questions:

  1. I'm using NRSDK 1.9.5. I assume this is different from the experimental package?
  2. No, I am not. Is this included in the streaming receiver?
  3. Yes.

I think the images were just a visual of what the text is already saying.

1 - If your SDK doesn’t say ‘experimental’ in it, that’s not the experimental version of the SDK. It would look like ‘NRSDKForUnity_Release_Experimental_1.10.2.unitypackage’

2 - The exe is linked on that page. You can see the link in your screen grab.

So all you're doing is using the experimental version of the SDK, which has a streaming prefab in it. You drag that into your hierarchy, build and install the app, then open the streaming exe.

I've found that if you're recording the streamed video, it doesn't look as crisp as in the glasses themselves, but it can be a useful tool when it comes to showing clients progress without them having to install a file each time.
