Using two capture methods fails

Hi,
I have an application that uses the CameraProxy to capture images that are passed on to another module which does some image processing on them:

// Set up the RGB camera and request YUV frames
m_NativeCameraProxy = CameraProxyFactory.CreateRGBCameraProxy();
m_NativeCameraProxy.SetImageFormat(CameraImageFormat.YUV_420_888);
m_NativeCameraProxy.Play();

and later:

var frame = m_NativeCameraProxy.GetFrame();
// ... do further processing with frame
This works fine.

Now I want to add RTP video streaming so that others can see on a PC what I am seeing through the glasses. For streaming I essentially use the VideoCapture2RTPExample, which internally uses NRVideoCapture.StartVideoModeAsync() to get an RGB image.
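
For context, this is roughly what my streaming setup looks like, condensed from the VideoCapture2RTPExample (the class name, the RTP address, and the fixed resolution are my own placeholders; treat this as a sketch of the sample, not a verbatim copy):

using NRKernal.Record;
using UnityEngine;

public class RtpStreamingExample : MonoBehaviour
{
    NRVideoCapture m_VideoCapture;

    void Start()
    {
        // Asynchronously create the capture object; the callback receives the instance.
        NRVideoCapture.CreateAsync(false, OnVideoCaptureCreated);
    }

    void OnVideoCaptureCreated(NRVideoCapture videoCapture)
    {
        if (videoCapture == null) return;
        m_VideoCapture = videoCapture;

        CameraParameters cameraParameters = new CameraParameters();
        cameraParameters.hologramOpacity = 1f;           // include the AR overlay in the stream
        cameraParameters.frameRate = 30;
        cameraParameters.cameraResolutionWidth = 1280;   // placeholder resolution
        cameraParameters.cameraResolutionHeight = 720;
        cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

        m_VideoCapture.StartVideoModeAsync(cameraParameters, OnVideoModeStarted);
    }

    void OnVideoModeStarted(NRVideoCapture.VideoCaptureResult result)
    {
        // The RTP example passes an rtp:// address where the local-record example
        // passes a file path; IP and port below are placeholders for my PC.
        m_VideoCapture.StartRecordingAsync("rtp://192.168.0.10:5555", OnRecordingStarted);
    }

    void OnRecordingStarted(NRVideoCapture.VideoCaptureResult result)
    {
        Debug.Log("RTP streaming started, success: " + result.success);
    }
}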

The problem is that when I use both methods at the same time, the image captured from the NativeCameraProxy looks completely broken, as if the raw image data were interpreted as a pixel format that it is not (e.g. the raw data is RGB, but it is interpreted as YUV).

As a result, my computer vision algorithms fail on those images.
The stream that I receive on the PC, however, looks fine.

I have two questions:

  1. Is there a way to use both capture methods at the same time without corrupting one of the images?
  2. If 1) is not possible, how can I get images that I can pass to a function while simultaneously streaming them via RTP (with the AR overlay rendered)?

Thank you in advance!

I did some further testing and tried to create a minimal example.
These are the steps to reproduce it:

  • Open the RGBCamera-Record demo scene (NRSDK/Demos)

  • Add the TextureCanvas from NRSDK/Demos/RGBCamera to the RGBCamera-Record scene

  • Change the VideoCapture2LocalExample.cs script to “save” the video to an RTP path, as described under Display casting (see the sketch after this list)

  • Run the scene on the Nreal glasses

  • start the stream
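
For the third step, the only change I made is in the VideoSavePath property of VideoCapture2LocalExample.cs, so that NRVideoCapture streams instead of writing a file (the property name is from the NRSDK demo; the address is a placeholder for the receiving PC):

// VideoCapture2LocalExample.cs: return an RTP address instead of a local file path.
public string VideoSavePath
{
    get
    {
        // The original demo built a path under Application.persistentDataPath here.
        // IP and port are placeholders (see the Display casting docs for the rtp:// form).
        return "rtp://192.168.0.10:5555";
    }
}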

In my case streaming and the virtual display work at the same time.
But if I stop the streaming while the TextureCanvas is displaying the camera feed, the app crashes.

It’s not exactly the same issue that I have in my real app, but it shows that there is some interference when using multiple camera captures :-/

I found the issue. The way I set up the NativeCameraProxy and call SetImageFormat() does not update a member of NativeCameraProxy (m_ActiveTextures) that GetActiveCameraImageFormat() relies on.

The subsequent call to NRVideoCapture.CreateAsync() checks GetActiveCameraImageFormat(), but that function returns the default value, which is RGB.

I worked around the issue by changing that default value to YUV_420.
This is not really a fix, but at least a workaround that works for me…
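
For anyone who wants to apply the same workaround: it boils down to changing the fallback value that GetActiveCameraImageFormat() returns when m_ActiveTextures has not been filled in. A sketch of the idea (this is not the literal SDK source; the surrounding code and the exact signature in NativeCameraProxy may differ between NRSDK versions):

// Sketch of the change inside NRSDK's NativeCameraProxy (names may differ per version).
public CameraImageFormat GetActiveCameraImageFormat()
{
    // m_ActiveTextures is never updated by SetImageFormat(), so the lookup misses
    // and this default is returned to NRVideoCapture.CreateAsync().
    // return CameraImageFormat.RGB_888;    // original default
    return CameraImageFormat.YUV_420_888;   // workaround: match the format set via SetImageFormat()
}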

@XREAL-dev: I think that after SetImageFormat() has been called, GetActiveCameraImageFormat() should return the format I set, not the default value…