We understand that first person view streaming can be very useful. We do have these tools available, although they are still in beta.
Please see the following links for guidance:
First Person View - Recording: Video Capture — NRSDK 1.7.0 documentation (nrealsdkdoc.readthedocs.io)
First Person View - Live Streaming:
First Person View — NRSDK 1.7.0 documentation (nrealsdkdoc.readthedocs.io)
Things you will need:
- NRSDK 1.7.0 Beta:
NRSDKForUnityAndroid_Beta_1.7.0.unitypackage - Google Drive
- Streaming Receiver (necessary for live streaming)
StreammingReceiver_v1.2.0.zip - Google Drive
- Hardware: PC + Adapted mobile phone or Compute Unit + Nreal Light
- SDK: v1.6.0_beta and above
Any feedback is welcome! Thank you so much.
Thank you for this, it really is important.
Is there a way to transmit audio back to the user wearing the Nreal glasses?
Hi @Nreal_JosephLiu I have feedback for the NRSDK 1.7.0 FPV:
- The audio streaming unfortunately has a delay of 2-3 seconds and then seems to play faster than in the app, almost as if it is trying to catch up with the delay. The video content is almost realtime, which is nice!
- Detected planes are drawn as opaque black areas. This did not happen with the previous version.
Thank you for sharing the beta package. I’ve tried it, but I’ve run into a curious problem.
I started the streaming receiver app on Windows and an Nreal app built with the beta package. Then I clicked the button to start streaming, but a dialog appeared telling me “a camera is disabled. Please ask the problem at a FAQ.”
This puzzles me, because I used NRRGBCamera in another Nreal app that I made. That app can launch the camera, but the app using this beta package can’t. What is wrong?
Please disable the Power Saving Mode in Nebula and try again to see if it still happens.
The issues are noted, and thank you very much for your feedback. Hopefully you will see them fixed in future releases of NRSDK.
Is it possible to run the streaming across different networks? For example, connecting the smartphone to a 5G network and the PC to another network.
Thank you @Nreal_JosephLiu. Do you have a timeline for when we can expect fixes for the First Person View streaming?
Any news about a timeline? This is a key feature.
Is there an update on this issue?
The app developers who helped us with our app say that the First Person View does not work as described in the SDK, and that they would have to build the feature from scratch for us at a substantial cost.
Not being technically minded, I can’t argue with them and so I am hoping the community can give some feedback as to how they have been able to use this feature in their apps.
For us having First Person View is a critical feature for use during product demonstrations.
The issue seems to be solved in the latest v1.10.0 experimental version of NRSDK.
Thank you for the reply!
So, does the feature allow live streaming, with sound, to a nearby device?
And to be clear when I talk to them, can you please give me the code/script/instructions they would need to drag into Unity in order to activate this?
They previously enabled the recording feature for us, so is this pretty much the same thing to activate the streaming?
And, does it work for both Nreal Light and Nreal Air?
Thank you so much!
Agreed that screen streaming is a critical feature. It would be great if we could stream to a browser over port 80 or SSL (443). The current solution uses an uncommon port that is usually blocked where we demo our product, and streaming is necessary for training purposes.
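For anyone hitting the same blocked-port problem: before a demo, it can help to verify from the demo network that the receiver's port is actually reachable. A minimal sketch in Python using only the standard library (the host and port in the usage example are placeholders; substitute your streaming receiver's actual address and port, which I'm assuming here):

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection resolves the host and attempts the TCP handshake
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and unreachable hosts
        return False

# Usage (placeholder address/port for the streaming receiver PC):
# if not port_reachable("192.168.1.50", 6000):
#     print("Receiver port blocked or receiver not running")
```

Running this from the laptop on the venue network tells you in advance whether the firewall will let the stream through, rather than finding out mid-demo.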