antoinemacia opened this issue 5 months ago
Hi,
You should already have access to odometry with Apple RoomPlan; as for loop closures, I'm not sure what they do. The RTAB-Map iOS app was not meant to be a framework, just an app, but the RTAB-Map library under the hood could be integrated into other apps. If you don't care about online mesh/point cloud rendering, you may be able to create a Rtabmap object on the C++ side, init it with an empty database (and the Kp/MaxFeatures=-1 parameter to disable feature extraction), and call rtabmap.process(SensorData(...)) with data coming from ARKit (maybe changing this line to rtabmap.process(data, pose) instead). That would be the minimal CPU required to save data in a format that rtabmap can easily reprocess offline (with the rtabmap-reprocess and rtabmap-export CLI tools).
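
In code, that minimal logging pipeline would look roughly like the sketch below. The database path, image buffers, intrinsics, and identity pose are placeholders, and the ARKit-to-rtabmap coordinate conversion for the pose is omitted:

```cpp
#include <rtabmap/core/Rtabmap.h>
#include <rtabmap/core/SensorData.h>
#include <rtabmap/core/CameraModel.h>
#include <rtabmap/core/Parameters.h>
#include <rtabmap/core/Transform.h>
#include <opencv2/core.hpp>

using namespace rtabmap;

int main()
{
    // Disable feature extraction: frames are only logged for offline reprocessing.
    ParametersMap params;
    params.insert(ParametersPair(Parameters::kKpMaxFeatures(), "-1"));

    // Init with an empty database; processed frames are appended to it.
    Rtabmap rtabmap;
    rtabmap.init(params, "/tmp/session.db"); // placeholder path

    // One iteration of the per-frame loop, with placeholder buffers/values.
    // In the real app, rgb/depth/intrinsics/pose come from the ARKit frame.
    cv::Mat rgb = cv::Mat::zeros(480, 640, CV_8UC3);    // from capturedImage
    cv::Mat depth = cv::Mat::zeros(480, 640, CV_32FC1); // meters, e.g. from sceneDepth
    CameraModel model(570.0, 570.0, 320.0, 240.0);      // fx, fy, cx, cy
    Transform pose = Transform::getIdentity();          // ARKit camera pose, converted

    SensorData data(rgb, depth, model, /*id=*/0, /*stamp=*/0.0);
    rtabmap.process(data, pose); // external pose replaces rtabmap's own odometry

    rtabmap.close(); // flush the database for rtabmap-reprocess / rtabmap-export
    return 0;
}
```

With Kp/MaxFeatures set to -1, visual-word extraction is skipped entirely, so each process() call mostly just writes the frame and the external pose to the database.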
cheers, Mathieu
Great, thank you for your prompt answer! I'll have a go at it this week.
Hello hello!
First and foremost, thank you for this amazing library/toolkit; it is truly impressive what it is capable of!
I'm trying to integrate it into an existing app that uses Apple RoomPlan. What would you suggest as the best way to integrate the odometry into an existing application using RoomPlan? I am trying to turn it into a package or framework, but I'm struggling with some of the native dependencies.
Ideally we could do everything on device, but I'm conscious that running object detection and meshing in the same session is a recipe for hot hands! Alternatively, we've also been thinking of extracting frames, camera intrinsics, etc. and processing them server side. Is there a wiki/tutorial/best practice on extracting what's necessary from an ARKit session for server processing via RTAB-Map?
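
For reference, a hedged sketch of the per-frame record such an extraction could save, to be replayed server side through the pipeline shown earlier; FrameRecord and its field names are illustrative, not an existing RTAB-Map or ARKit API:

```cpp
#include <array>
#include <string>

// One record per ARFrame; the referenced images would be written to disk
// alongside a small index file, then uploaded for offline processing.
struct FrameRecord
{
    double stamp;                      // ARFrame.timestamp, in seconds
    std::string rgbPath;               // capturedImage, converted and saved as JPEG/PNG
    std::string depthPath;             // sceneDepth.depthMap, saved as 32-bit float meters
    std::array<double, 4> intrinsics;  // fx, fy, cx, cy from ARCamera.intrinsics
    std::array<float, 16> cameraPose;  // ARCamera.transform, column-major 4x4
};

// On the server, each record can be loaded and fed through the same
// rtabmap.process(SensorData(...), pose) loop, after converting the ARKit
// camera pose (x right, y up, z backward) into rtabmap's convention.
```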
Hope it's clear, thanks!