Closed patriciogonzalezvivo closed 3 years ago
Thank you! I apologize for the late reply. Adding support for position tracking would not be easy for e.g. the Face ID (TrueDepth) camera, so this feature has a lower priority on my TODO list.
If I don't reply to you on GitHub for a longer period of time, please send me an email at support@record3d.app — I check that email regularly.
Cool. I totally understand! Thanks!
I would add a huge +1 request for camera extrinsics!!! Support for just the LiDAR/back camera would be SUPER helpful and awesome. I think this should be easy to implement in ARKit, e.g. following this tutorial: https://developer.apple.com/documentation/arkit/environmental_analysis/visualizing_a_point_cloud_using_scene_depth.
I would be happy to contribute if I can in order to get this implemented.
Also emailed you. P.S. thanks for the app. I downloaded and bought all the versions/extensions today. Super useful.
Note: in the new release (released today), @marek-simonik has added camera extrinsics (poses) for transforming to world coordinates!! However, note that this is only for the LiDAR (back) camera and doesn't work for the front camera. Super helpful! Huge thanks @marek-simonik
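For anyone wanting to use the new extrinsics: a per-frame pose is typically a 4×4 camera-to-world matrix (rotation in the upper-left 3×3, translation in the last column), and applying it to camera-space points in homogeneous coordinates yields world coordinates. A minimal NumPy sketch — the pose values below are made up for illustration, not Record3D output:

```python
import numpy as np

# Hypothetical 4x4 camera-to-world pose (extrinsics) for one frame:
# upper-left 3x3 is the rotation R, last column is the translation t.
pose = np.array([
    [1.0, 0.0,  0.0, 0.5],
    [0.0, 0.0, -1.0, 1.2],
    [0.0, 1.0,  0.0, 0.0],
    [0.0, 0.0,  0.0, 1.0],
])

def camera_to_world(points_cam, pose):
    """Transform an Nx3 array of camera-space points to world coordinates."""
    n = points_cam.shape[0]
    homogeneous = np.hstack([points_cam, np.ones((n, 1))])  # Nx4
    return (pose @ homogeneous.T).T[:, :3]

# A point 1 m in front of the camera along its +Z axis.
pts = np.array([[0.0, 0.0, 1.0]])
print(camera_to_world(pts, pose))  # -> [[0.5 0.2 0. ]]
```

Check the app's documentation for the exact convention (camera-to-world vs. world-to-camera, and the camera's axis orientation) before relying on this in a pipeline.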
This is fantastic news! Thank you @marek-simonik !!
No problem, I'm glad I could help :).
I wonder if this means the data will be useful as an input to photogrammetry apps such as RealityCapture. My previous attempts at using iOS LiDAR data have been hampered by the lack of pose data.
Let me start by saying that this is a great app and the API is a bliss to work with. I was wondering whether it would be possible to send some camera coordinates, like relative XYZ for the up/left/forward vectors. I'm not sure if ARKit provides that data, but I think it would be very interesting for virtual production and AR across devices.
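If a per-frame 4×4 transform is available (as ARKit exposes via `ARCamera.transform`), the up/left/forward vectors asked about above can be read straight off its rotation columns. A sketch assuming an identity pose purely for illustration, and ARKit's convention that the camera looks down its local -Z axis:

```python
import numpy as np

# Hypothetical camera-to-world transform (identity pose for illustration).
transform = np.eye(4)

# The rotation columns are the camera's basis vectors expressed in world space.
right    = transform[:3, 0]    # camera +X
up       = transform[:3, 1]    # camera +Y
forward  = -transform[:3, 2]   # ARKit cameras look along -Z
position = transform[:3, 3]    # camera origin in world space

print(right, up, forward, position)
```

With the identity pose, `forward` comes out as `[0, 0, -1]`; a real per-frame transform would rotate these vectors along with the device.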