I’m using a Samsung S10 (Snapdragon version) with NRSDK 1.9.5.
When I capture a photo via the RGB camera, it takes about 0.45 s to get the final image as a Texture2D, and I can feel a significant FPS drop, which causes a serious user-experience problem.
I modified the PhotoCaptureExample in NRSDK/Demos.
I recorded a timestamp, and it takes about 0.45 s to get the final image. That's far too long!
Is there any way to reduce the cost, or to run the photo capture on a separate thread? Or anything else I can do to make it faster and more efficient?
The two methods in the screenshot are used to save images. If you don't need them, you can comment them out. These two methods take approximately 200 ms, so hopefully this solves your problem.
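For illustration, here is a rough sketch of what that change could look like in the demo's capture callback. Note: the exact method and field names (`OnCapturedPhotoToMemory`, `SaveTextureAsPNG`, `SaveTextureToGallery`, `m_CameraResolution`) are assumptions based on the typical PhotoCaptureExample structure, since the screenshot isn't reproduced here; check them against your copy of the demo.

```csharp
// Hypothetical sketch of the NRSDK PhotoCaptureExample callback.
// The idea: keep the in-memory texture, skip the ~200 ms of disk I/O.
void OnCapturedPhotoToMemory(NRPhotoCapture.PhotoCaptureResult result,
                             PhotoCaptureFrame photoCaptureFrame)
{
    var targetTexture = new Texture2D(m_CameraResolution.width,
                                      m_CameraResolution.height);
    // Copy the captured image data into the texture, as the demo does.
    photoCaptureFrame.UploadImageDataToTexture(targetTexture);

    // The two save methods from the screenshot -- comment them out
    // if you only need the texture in memory:
    // SaveTextureAsPNG(targetTexture);      // assumed name
    // SaveTextureToGallery(targetTexture);  // assumed name
}
```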
Can I get the raw bytes from the camera directly instead of a Texture2D? I want to transmit the captured image to a server.
We don't provide such an interface currently, but I think the data shown in the second screenshot is what you need; you can make some modifications based on it.
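If what you end up with is still a Texture2D, one way to get a byte array without the expensive `EncodeToPNG` step is `GetRawTextureData`, which returns the uncompressed pixel buffer directly. This is a sketch, not NRSDK-specific; `UploadToServer` is a placeholder for your own upload routine, and the callback/field names are assumed as above.

```csharp
// Sketch: extract raw pixel bytes from the captured texture.
// GetRawTextureData is a cheap memory copy; no PNG/JPG encoding happens.
void OnCapturedPhotoToMemory(NRPhotoCapture.PhotoCaptureResult result,
                             PhotoCaptureFrame photoCaptureFrame)
{
    var tex = new Texture2D(m_CameraResolution.width,
                            m_CameraResolution.height);
    photoCaptureFrame.UploadImageDataToTexture(tex);

    // Uncompressed pixel bytes (format matches tex.format, e.g. RGBA32).
    byte[] raw = tex.GetRawTextureData();
    StartCoroutine(UploadToServer(raw)); // hypothetical upload coroutine
}
```

Keep in mind the raw buffer is much larger than a PNG/JPG, so you may still want to compress before sending over the network.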
I found a new problem: even getting the bytes from NRRGBCamera causes the FPS to drop for a short time, from about 60 FPS to about 3 FPS. Is there any way to capture the image bytes without such a big impact on performance?
I found that the image conversion consumes a lot of CPU time. Is that the reason?
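That is a plausible cause: `Texture2D.EncodeToPNG`/`EncodeToJPG` run on the main thread and block rendering. One common workaround (a sketch, not an NRSDK API) is to copy the raw pixels on the main thread, then do the JPEG compression on a worker thread with `ImageConversion.EncodeArrayToJPG`, which since Unity 2018.3 operates on a plain array and can be called off the main thread:

```csharp
using System.Threading.Tasks;
using UnityEngine;

// Sketch: cheap raw copy on the main thread, expensive compression on a
// worker thread, so the render loop keeps running at full FPS.
void EncodeAndSendAsync(Texture2D tex)
{
    byte[] raw = tex.GetRawTextureData();   // main thread: fast memory copy
    var format = tex.graphicsFormat;
    uint w = (uint)tex.width, h = (uint)tex.height;

    Task.Run(() =>
    {
        // JPEG compression happens off the main thread.
        byte[] jpg = ImageConversion.EncodeArrayToJPG(raw, format, w, h, 0, 75);
        // ...send `jpg` to the server from here (hypothetical step);
        // remember that Unity engine APIs must not be touched on this thread.
    });
}
```

The key constraint is that only the array-based encoders are safe off the main thread; anything touching the `Texture2D` itself must stay on the main thread.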