Display a virtual object at the right location relative to the camera

Hi, I have a question about the Nreal SDK. I have detected a real object with the RGB camera of the Nreal glasses, so I have its 3D coordinates relative to that camera. How can I transform an overlaid virtual object to the right place? I thought I could take NRFrame.headPose and transform the coordinates with that matrix, but then the virtual object is always about 2-7 cm above the real object (depending on the distance to the object). Apart from this offset, moving the real object also moves the virtual object in the right way.

My next guess was to also use NRFrame.EyePoseFromHead, since - as I understand it - that is the offset of the center camera from the head pose. But what should I do with it? I multiplied the rotation of the eye pose with the rotation of the head pose as quaternions, added the two positions, and built a new transformation matrix out of that. This is more accurate, but still way off. Can anyone help me with this problem? Has anyone managed to accurately overlay a virtual object over a real one?
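
For reference, the first attempt described above (head pose only) corresponds roughly to the following Unity C# sketch; the NRFrame.HeadPose property name and treating the detected point as if it were head-relative are assumptions of the sketch, not something from the SDK docs:

using UnityEngine;
using NRKernal;

public static class HeadPoseOnlyPlacement
{
    // World position of a detected point, using only the head pose of the frame.
    // The point is actually relative to the RGB camera, but is treated here as if
    // it were relative to the head, so the RGB camera's offset from the head is
    // not accounted for.
    public static Vector3 ToWorld(Vector3 detectedPoint)
    {
        Pose headPose = NRFrame.HeadPose;
        return headPose.position + headPose.rotation * detectedPoint;
    }
}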

NRFrame.EyePoseFromHead refers to the pose of the camera node relative to the head (the NRCameraRig), i.e. the node's coordinates expressed in the head's frame.
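
Since that pose is relative to the head, the camera's world pose is the head pose composed with it, and the local offset has to be rotated by the head rotation rather than simply added. A minimal sketch of that composition (the RGBEyePose field name and NRFrame.HeadPose are assumptions here; check them against your NRSDK version):

using UnityEngine;
using NRKernal;

public static class RGBCameraPoseUtil
{
    // World-space pose of the RGB camera: head pose composed with the
    // camera-from-head pose.
    public static Pose GetRGBCameraWorldPose()
    {
        Pose headPose = NRFrame.HeadPose;                       // head (NRCameraRig) in world space
        Pose rgbFromHead = NRFrame.EyePoseFromHead.RGBEyePose;  // RGB camera relative to the head (field name assumed)

        // The local translation is rotated by the head rotation before being added;
        // simply summing the two positions leaves that rotation out.
        Vector3 position = headPose.position + headPose.rotation * rgbFromHead.position;
        Quaternion rotation = headPose.rotation * rgbFromHead.rotation;
        return new Pose(position, rotation);
    }

    // Transforms a point given in RGB-camera coordinates into world coordinates.
    public static Vector3 CameraPointToWorld(Vector3 pointInCameraSpace)
    {
        Pose camPose = GetRGBCameraWorldPose();
        return camPose.position + camPose.rotation * pointInCameraSpace;
    }
}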

@Elisabeth Hey - I found things were slightly off too, either too high or too low, so I came up with my own offset for the RGB camera. Here are the values I use:

offsetTranslation: <0.01171373, 0.0001187575, 0.003732686> // (x, y, z)
offsetRotation: <0.1217845, -0.009724616, -0.009724616, 0.9924613> // (x, y, z, w)

And I use them like this:

NRFrame.GetFramePresentHeadPose(ref pose, ref frameTimestamp);  // head pose for the current frame
var position = pose.position + offsetTranslation;               // shift the head position by the RGB camera offset
var rotation = pose.rotation * offsetRotation;                  // rotate into the RGB camera's orientation

Once I create my camera view transform, I can get better accuracy on placement.
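
Putting that together, here is a rough sketch of how the corrected pose could be turned into a camera-to-world transform for placing an overlay at a point detected in RGB-camera space. It reuses the offsets and the GetFramePresentHeadPose call from the snippet above; the exact signature and the timestamp type may differ between NRSDK versions:

using UnityEngine;
using NRKernal;

public class RGBCameraOverlay : MonoBehaviour
{
    public Transform virtualObject;

    // Hand-tuned RGB camera offsets from the post above.
    static readonly Vector3 offsetTranslation = new Vector3(0.01171373f, 0.0001187575f, 0.003732686f);
    static readonly Quaternion offsetRotation = new Quaternion(0.1217845f, -0.009724616f, -0.009724616f, 0.9924613f);

    // Places the overlay at a point detected in RGB-camera coordinates.
    public void Place(Vector3 pointInCameraSpace)
    {
        Pose pose = Pose.identity;
        ulong frameTimestamp = 0;
        NRFrame.GetFramePresentHeadPose(ref pose, ref frameTimestamp);

        // Corrected camera pose, as in the snippet above.
        Vector3 position = pose.position + offsetTranslation;
        Quaternion rotation = pose.rotation * offsetRotation;

        // Camera-to-world transform built from the corrected pose.
        Matrix4x4 cameraToWorld = Matrix4x4.TRS(position, rotation, Vector3.one);

        // Express the detected point in world coordinates and place the object there.
        virtualObject.position = cameraToWorld.MultiplyPoint3x4(pointInCameraSpace);
        virtualObject.rotation = rotation;
    }
}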