introlab / rtabmap

RTAB-Map library and standalone application
https://introlab.github.io/rtabmap

iOS app architecture #1138

Closed wetoo-cando closed 9 months ago

wetoo-cando commented 9 months ago

Great work @matlabbe and thanks for making it available to the community.

I am trying to build an iOS app that uses color and laser data, and in which most of the libraries are Python/C++. Since rtabmap already essentially does that, it would help me a lot to understand how the iOS app works. Of course, your open-source code helps. But since I'm new to iOS development, I'm wondering: is there some high-level documentation that describes how the iOS app interacts with the main rtabmap C++ library? If not, could you please describe it here in a few sentences?

Thanks!

matlabbe commented 9 months ago

There is no documentation about the Swift-to-C++ interface. Basically, Swift calls Objective-C functions that call C functions. The C++ rtabmap object is passed to the C functions as an opaque pointer.

In summary, the Swift code is mainly UI, calling the C++ library under the hood. For ARKit, it forwards the pose and image data to the C++ library, which does the loop closure detection and the OpenGL rendering in C++.

We use the same OpenGL backend for the Android and iOS apps.

wetoo-cando commented 9 months ago

Thanks!

In the iOS code, you include the RtabMapApp.h header. Is this the same as rtabmap/app/android/jni/RTABMapApp.h?

Or is this a header that somehow gets automatically generated during the iOS build?

matlabbe commented 9 months ago

The iOS build includes the rtabmap/app/android/jni folder. I haven't taken the time to move that code into a folder shared between Android and iOS.

wetoo-cando commented 9 months ago

Ok thanks!

Gyudori commented 4 months ago

Hello, I'm new to iOS app development and curious about the main architecture of the iOS RTAB-Map app. It seems that the RTAB-Map iOS app (without LiDAR) doesn't heavily rely on core algorithms like odometry and loop closure from the RTAB-Map core. Can you confirm whether this is true, and clarify whether the app primarily uses outputs from ARKit? Thank you for your assistance in helping me understand this better.

matlabbe commented 4 months ago

The RTAB-Map iOS app (without LiDAR) uses these inputs from ARKit: the pose, the RGB frame, and the 3D tracked features. Loop closure is still done by the RTAB-Map library (we reproject the 3D tracked features into the RGB frame to extract the corresponding visual descriptors), as is the 3D mesh reconstruction/texturing.

Gyudori commented 4 months ago

Thank you so much for the prompt response!!! I will check based on the information you provided.