Hello
I would like to use Nreal glasses for an AR guided tour of our city, where every participant sees a series of videos anchored in space while the tour guide gives more info and context.
I have a demo made in Unity where you first go around placing cubes, to be used as placeholders for the videos. This works very well in small environments like an office or a home.
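For reference, the placement logic is something like this minimal sketch (the mouse click and the 2 m distance are simplifications standing in for the real input and values):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal placement sketch: on a click/tap, drop a cube two meters in
// front of the current head pose to mark where a video will be anchored.
// Assumes the tracked head pose drives Camera.main; the input check and
// the 2 m distance are placeholders for whatever the real demo uses.
public class CubePlacer : MonoBehaviour
{
    public float placementDistance = 2f;
    private readonly List<GameObject> placedCubes = new List<GameObject>();

    void Update()
    {
        // Replace with the controller/trigger input of the actual setup.
        if (Input.GetMouseButtonDown(0))
        {
            Transform head = Camera.main.transform;
            GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            cube.transform.position = head.position + head.forward * placementDistance;
            cube.transform.localScale = Vector3.one * 0.3f;
            placedCubes.Add(cube);
        }
    }
}
```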
But when we tried this solution at the actual location, the cubes were placed fine, yet on going back to the starting point I noticed that the camera in Unity was no longer at the starting point.
It looks like errors accumulate, making the real position differ from the one in Unity.
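To quantify it, I could log the offset with something like this (the Space key is just a stand-in for a "back at the start" signal):

```csharp
using UnityEngine;

// Drift probe: record the camera pose at startup, then log the offset
// whenever the user signals they are physically back at the start.
// The Space key is an arbitrary stand-in for that signal.
public class DriftProbe : MonoBehaviour
{
    private Vector3 startPosition;

    void Start()
    {
        startPosition = Camera.main.transform.position;
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            float drift = Vector3.Distance(Camera.main.transform.position, startPosition);
            Debug.Log($"Accumulated drift after the loop: {drift:F2} m");
        }
    }
}
```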
My first thought was to use GPS, but the dev kit has no GPS, and I think the Nreal Light won't have one either.
I am not sure how to solve this problem, and I can't find any similar issue on the forum.
Could this be solved in software?
Is this expected behaviour for the glasses? Maybe they have a maximum range.
Could it be a problem with Unity and the distance from the origin? I don't think this is the case.
Thanks
Hi developer. This issue is likely caused by the limited storage for spatial points in the Light glasses. The Light can only save a small amount of cloud point data, so you may notice that the starting point is different when the camera returns to the start.
Currently, if you want to do an AR guide of your city, you would need a server that stores the positional info of your city and pushes that data to the glasses for localization. We don't have such a server at the moment.
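If such a server existed, the client side might look roughly like this sketch: fetch serialized anchor poses for the current area and respawn the placeholders. The URL and JSON format here are assumptions for illustration only, not an existing Nreal service:

```csharp
using System;
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical client: download anchor poses for an area and respawn
// the cube placeholders at those positions. Both the URL and the JSON
// format are assumptions, not an existing Nreal service.
public class AnchorDownloader : MonoBehaviour
{
    [Serializable] public class AnchorPose { public Vector3 position; public Vector3 eulerAngles; }
    [Serializable] public class AnchorSet { public AnchorPose[] anchors; }

    public string serverUrl = "https://example.com/api/anchors?area=old-town"; // hypothetical endpoint

    IEnumerator Start()
    {
        using (UnityWebRequest request = UnityWebRequest.Get(serverUrl))
        {
            yield return request.SendWebRequest();
            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogWarning($"Anchor download failed: {request.error}");
                yield break;
            }
            AnchorSet set = JsonUtility.FromJson<AnchorSet>(request.downloadHandler.text);
            foreach (AnchorPose pose in set.anchors)
            {
                GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
                cube.transform.SetPositionAndRotation(pose.position, Quaternion.Euler(pose.eulerAngles));
            }
        }
    }
}
```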
Hi, thanks for the response, but I have a few questions:
What do you mean by cloud point data? I thought the glasses were using the accelerometers to track their position in Unity space.
Is there an API to manage these points?
By "info of your city", do you mean cloud points or GPS positions?
At the moment we cannot use a server because the Wi-Fi coverage is awful. But if I had a way to save and load this data, I could probably find a solution.
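On the Unity side, saving and reloading the placed poses is easy enough; something like this sketch with JsonUtility. I understand this only stores the Unity-space poses, not the glasses' internal feature map, so it would not fix relocalization by itself:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Offline persistence sketch: write the placed cube positions to disk and
// read them back on the next run. Note this saves only Unity-space poses;
// it cannot restore the glasses' internal SLAM map.
public class AnchorStore : MonoBehaviour
{
    [Serializable] public class SavedPose { public Vector3 position; }
    [Serializable] public class SavedPoses { public List<SavedPose> poses = new List<SavedPose>(); }

    private string FilePath => Path.Combine(Application.persistentDataPath, "anchors.json");

    public void Save(IEnumerable<Transform> anchors)
    {
        var data = new SavedPoses();
        foreach (Transform t in anchors)
            data.poses.Add(new SavedPose { position = t.position });
        File.WriteAllText(FilePath, JsonUtility.ToJson(data));
    }

    public List<Vector3> Load()
    {
        var result = new List<Vector3>();
        if (!File.Exists(FilePath)) return result;
        var data = JsonUtility.FromJson<SavedPoses>(File.ReadAllText(FilePath));
        foreach (SavedPose p in data.poses) result.Add(p.position);
        return result;
    }
}
```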
Hi. Sorry for misleading you. I will try to clarify:
As you know, localization in our Light glasses is driven by the SLAM algorithm, which generates feature points (cloud points) from the SLAM camera. So it does not only use the accelerometers to calculate position; it also needs feature points to recognize the surroundings.
And the "info of your city" means not only GPS data but also the feature points of the city, so that the glasses can overlay AR objects on the actual surroundings. Our glasses can only save a small amount of data and have no GPS function. So… there is no better solution from our side alone.
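To illustrate why accelerometers alone are not enough: position comes from integrating acceleration twice, so even a tiny constant sensor bias grows quadratically with time, which is why visual feature points are needed to correct the estimate. A toy calculation (the 0.01 m/s² bias is an arbitrary example value):

```csharp
using UnityEngine;

// Toy dead-reckoning demo: double-integrating a tiny constant
// accelerometer bias. After t seconds the position error is
// 0.5 * bias * t^2, so it grows quadratically without visual correction.
public class ImuDriftDemo : MonoBehaviour
{
    void Start()
    {
        const float bias = 0.01f; // m/s^2, arbitrary example value
        float velocity = 0f, position = 0f, dt = 0.01f;
        for (float t = 0f; t < 600f; t += dt) // simulate 10 minutes
        {
            velocity += bias * dt;
            position += velocity * dt;
        }
        // ~0.5 * 0.01 * 600^2 = 1800 m of error from the bias alone.
        Debug.Log($"Position error after 10 min: {position:F0} m");
    }
}
```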
Good to know. Is there a way, or an API, to manage these cloud points?
No, the API is not accessible in the current SDK.
Hello, I have a few more questions regarding this issue:
1. Are there any functionalities that require using the Nreal Light with the dev kit base instead of a supported Android device? Because that way I could use the GPS of the device (see the sketch after these questions).
2. If there is no API or other way to manage these cloud points, what other solution can I use to solve the problem?
3. Also, where are these cloud points saved? In the Nreal Light glasses or the Android base? In RAM or on disk?
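For reference, reading the phone's GPS from Unity would be something like this, using Unity's built-in LocationService (it needs the location permission on Android, and accuracy is a few meters at best, so it could coarsely select a tour area but not replace SLAM tracking):

```csharp
using System.Collections;
using UnityEngine;

// GPS sketch using Unity's built-in LocationService on the Android host.
// Requires location permission in the Android manifest.
public class GpsReader : MonoBehaviour
{
    IEnumerator Start()
    {
        if (!Input.location.isEnabledByUser)
        {
            Debug.LogWarning("Location services disabled by the user.");
            yield break;
        }
        Input.location.Start();
        // Wait for the service to initialize (up to ~20 s).
        int maxWait = 20;
        while (Input.location.status == LocationServiceStatus.Initializing && maxWait-- > 0)
            yield return new WaitForSeconds(1);
        if (Input.location.status != LocationServiceStatus.Running)
        {
            Debug.LogWarning("Unable to determine device location.");
            yield break;
        }
        LocationInfo info = Input.location.lastData;
        Debug.Log($"Lat {info.latitude}, Lon {info.longitude}, ±{info.horizontalAccuracy} m");
    }
}
```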
Thanks
Hi, please check the following answers.
- No, there is no functional limitation when using an Android phone;
- Maybe you can try a third-party cloud point service (a rough sketch of that flow is below);
- The data is saved temporarily in the Light's RAM.
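For the third-party option, most cloud anchor services (for example Azure Spatial Anchors or ARCore Cloud Anchors) follow roughly a create/upload/resolve flow. The interface below is purely illustrative of that shape; none of these names come from a real SDK:

```csharp
using System.Threading.Tasks;
using UnityEngine;

// Hypothetical shape of a third-party cloud anchor integration.
// Real services differ in detail; these names are illustrative only.
public interface ICloudAnchorService
{
    // Capture local feature data around a pose and upload it; returns
    // a persistent anchor id that any client can resolve later.
    Task<string> CreateAnchorAsync(Pose localPose);

    // Download the stored feature data and match it against the current
    // surroundings, yielding the anchor's pose in the local session.
    Task<Pose> ResolveAnchorAsync(string anchorId);
}

public class TourAnchorLoader : MonoBehaviour
{
    public async void PlaceVideoAt(ICloudAnchorService service, string anchorId)
    {
        Pose pose = await service.ResolveAnchorAsync(anchorId);
        GameObject placeholder = GameObject.CreatePrimitive(PrimitiveType.Cube);
        placeholder.transform.SetPositionAndRotation(pose.position, pose.rotation);
    }
}
```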