Closed ketourneau closed 2 years ago
AR Foundation is an abstraction layer; it uses the data from the underlying platform (ARKit in this case). When reviewing your video, we noticed this room is empty with monocolored walls, which can cause the platform to struggle to track the environment.
Apple provides some guidelines about World Tracking here: https://developer.apple.com/documentation/arkit/configuration_objects/understanding_world_tracking#2891890
For a more reliable tracking experience, I might recommend using image tracking, then once the image is being tracked, creating an anchor at its pose. Using image tracking allows your users to relocalize if tracking is disrupted while walking around.
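A minimal sketch of that suggestion, assuming an `ARTrackedImageManager` and `ARAnchorManager` are present in the scene; `contentPrefab` is a hypothetical prefab standing in for the BIM content:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch (untested): when a reference image is first detected, create an
// anchor at the image's pose and parent the content to it, so the content
// can relocalize from the image if tracking is disrupted.
public class ImageAnchorPlacer : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager imageManager;
    [SerializeField] GameObject contentPrefab; // hypothetical content prefab

    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
        {
            // Adding an ARAnchor component registers the pose with the
            // session (requires an ARAnchorManager in the scene).
            var anchor = new GameObject("ImageAnchor").AddComponent<ARAnchor>();
            anchor.transform.SetPositionAndRotation(image.transform.position,
                                                    image.transform.rotation);
            Instantiate(contentPrefab, anchor.transform);
        }
    }
}
```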
That seems particularly bad, even for a room with less than ideal conditions. Can you try Apple's sample and compare the results on equivalent hardware?
If Apple's sample produces similar results, you should raise the issue at https://developer.apple.com/bug-reporting/
Thank you for your answers. We reproduced the drift problem with Apple's sample and raised an issue via Apple's bug reporting.
Apple replied: "Lidar devices should work fine in that environment." and we sent additional data for debugging (sysdiagnose).
We'll wait and see.
Describe the bug We currently use ARFoundation to superimpose a BIM model in world-scale AR. We use plane detection to detect a real-world corner and place an anchor there.
Many of our customers have problems with plane detection (as in the attached video): the planes move a lot (drift?). https://user-images.githubusercontent.com/39763266/165557856-69676f73-2fc9-48ae-9016-2767ef32fdbe.mp4
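For context, our placement logic looks roughly like the following sketch (simplified; assumes `ARRaycastManager`, `ARPlaneManager`, and `ARAnchorManager` in the scene):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch (simplified): raycast against detected planes at a screen point
// and attach an anchor to the hit plane. AttachAnchor keeps the anchor
// associated with the plane as the platform refines it.
public class PlaneAnchorPlacer : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] ARPlaneManager planeManager;
    [SerializeField] ARAnchorManager anchorManager;

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    public ARAnchor PlaceAnchor(Vector2 screenPoint)
    {
        if (raycastManager.Raycast(screenPoint, hits,
                                   TrackableType.PlaneWithinPolygon))
        {
            var hit = hits[0];
            var plane = planeManager.GetPlane(hit.trackableId);
            return anchorManager.AttachAnchor(plane, hit.pose);
        }
        return null; // no plane hit at this screen point
    }
}
```

Even with the anchor attached this way, the detected planes themselves drift as shown in the video.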
To Reproduce Steps to reproduce the behavior:
Expected behavior Planes don't drift and the AR scene is stable.
Actual behavior Planes drift.
Smartphone (please complete the following information):