The audio streaming unfortunately has a delay of 2-3 seconds and then seems to play faster than real time, almost as if it is trying to catch up with the delay. The video content is almost real-time, which is nice!
Detected planes are drawn as opaque black areas. This did not happen with the previous version.
Thank you for sharing the beta package. I've tried it, but I ran into a curious problem.
I started a streaming receiver app on Windows and an Nreal app built with the beta package. After that, I clicked the button to start streaming, but a dialog appeared saying "The camera is disabled. Please check the FAQ about this problem."
This puzzles me, because I have used NRRGBCamera in another Nreal app that I made. That app can launch the camera, but the app that uses this beta package can't start it. What is wrong?
The app developers who helped us with our app say that first person view does not work as described in the SDK, and that they will have to build the First Person View feature from scratch for us at a substantial cost.
Not being technically minded, I can't argue with them, so I am hoping the community can give some feedback on how they have been able to use this feature in their apps.
For us, First Person View is a critical feature for use during product demonstrations.
Agreed that screen streaming is a critical feature; it would be great if we could stream to a browser over port 80 or SSL (443). The current solution uses an uncommon port that is usually blocked at the sites where we demo our product, and streaming is necessary for training purposes.
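As a possible interim workaround until the streaming port is configurable, one could re-expose the stream on a standard port with a small TCP relay on the receiver machine. This is only a sketch and an assumption on my part: I have not tested it against the Nreal streaming protocol, the port numbers below are placeholders, and binding a privileged port like 80 typically requires administrator rights.

```python
import socket
import threading

def pipe(src, dst):
    """Copy bytes from one socket to the other until EOF."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def make_relay(target_host, target_port, listen_port=0):
    """Listen on listen_port (0 = pick a free port) and forward every
    connection to (target_host, target_port). Returns the bound port.
    Note: binding ports below 1024 usually requires elevated privileges."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", listen_port))
    port = server.getsockname()[1]
    server.listen(5)

    def serve():
        while True:
            client, _ = server.accept()
            upstream = socket.create_connection((target_host, target_port))
            # Shuttle bytes in both directions on daemon threads.
            threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
            threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

    threading.Thread(target=serve, daemon=True).start()
    return port
```

For example, `make_relay("127.0.0.1", STREAM_PORT, 80)` (with `STREAM_PORT` being whatever port the receiver actually uses, a hypothetical name here) would make the stream reachable on port 80. This does not add SSL; for that, a proper reverse proxy would be the better tool.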