MarekKowalski / LiveScan3D

LiveScan3D is a system designed for real-time 3D reconstruction using multiple Azure Kinect or Kinect v2 depth sensors simultaneously.
MIT License

Supported platforms and OS? #30

Closed: ChangSurrey closed this issue 3 years ago

ChangSurrey commented 5 years ago

Hello,

Great project!

I'm getting the impression that this project is currently developed mainly for use on HoloLens. Does it support any other OS, say Android or iOS?

What I ultimately want to do is implement this system on, e.g., an Android or Apple device, in a similar way to what people see through the HoloLens: the mobile device would use its camera to display the real-world image while also rendering the point cloud read from the LiveScan3D server. Do you think this is possible?

Thanks in advance! Chang

MarekKowalski commented 5 years ago

Hi Chang,

The project is written in Unity, with some additional code implemented for UWP (separated by an ifdef) to support HoloLens. It should work anywhere Unity or UWP do. Having said that, I know there were some issues with shaders on iOS, but it seems that someone solved them; more details here: https://github.com/MarekKowalski/LiveScan3D-Hololens/issues/15#issuecomment-413046212
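
To give a sense of what that separation looks like, here is a minimal sketch of Unity's conditional compilation for UWP. The class name, socket choice, and method are illustrative assumptions, not the actual LiveScan3D sources:

```csharp
using UnityEngine;

// Hypothetical sketch of Unity's ifdef-style platform separation.
// UNITY_WSA is defined when building for UWP targets such as HoloLens;
// the guard below is the common pattern for keeping UWP-only APIs out
// of the editor and other platforms.
public class NetworkReceiver : MonoBehaviour
{
#if !UNITY_EDITOR && UNITY_WSA
    // UWP builds use the Windows.Networking stack.
    private Windows.Networking.Sockets.StreamSocket socket;

    private async void Connect(string host, int port)
    {
        socket = new Windows.Networking.Sockets.StreamSocket();
        await socket.ConnectAsync(
            new Windows.Networking.HostName(host), port.ToString());
    }
#else
    // Everywhere else, the regular .NET socket API is available.
    private System.Net.Sockets.TcpClient client;

    private void Connect(string host, int port)
    {
        client = new System.Net.Sockets.TcpClient(host, port);
    }
#endif
}
```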

I think the project you have in mind is possible, but you will need a framework with something like SLAM (simultaneous localization and mapping) to track the phone's pose. I believe Apple's ARKit has this kind of functionality.
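
As a rough illustration of that second point (hypothetical code, not part of LiveScan3D): once a SLAM framework such as ARKit or ARCore is driving the device camera's pose, e.g. through Unity's AR Foundation, the received point cloud only needs to be placed once in world space and it will stay registered as the phone moves:

```csharp
using UnityEngine;

// Hypothetical sketch: the AR framework (ARKit/ARCore) moves the
// camera using its SLAM-tracked pose, so content anchored in world
// space appears fixed in the real scene.
public class PointCloudPlacer : MonoBehaviour
{
    // Parent transform of the rendered LiveScan3D point cloud;
    // assigned in the inspector in this sketch.
    public Transform pointCloudRoot;

    // Place the cloud at a chosen world-space pose once; no per-frame
    // update is needed afterwards because the tracked camera moves
    // instead of the content.
    public void PlaceCloud(Vector3 worldPosition, Quaternion worldRotation)
    {
        pointCloudRoot.SetPositionAndRotation(worldPosition, worldRotation);
    }
}
```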

Marek

ChangSurrey commented 5 years ago

Thanks a lot Marek for the information!

Regarding the SLAM engine: based on your experience, which would you say performs better, iOS's ARKit or Android's ARCore?

Thanks! Chang