ibaiGorordo / ONNX-msg_chn_wacv20-depth-completion

Python script for performing depth completion from sparse depth and RGB images using the msg_chn_wacv20 model in ONNX.
MIT License

Getting dense depth map result on a mobile device #1

Closed ofirkris closed 2 years ago

ofirkris commented 3 years ago

Hi, my goal is to experiment with CoreML/TFLite to get a dense depth map using your ML model. My idea is to provide the input image plus a depth map from https://github.com/FilippoAleotti/mobilePydnet or https://github.com/3dify-app/3dify-ios (mono depth models).

Will this work as I expect, or do I need an additional input?

ibaiGorordo commented 3 years ago

In theory yes, but one problem with monocular depth estimation models is that they only provide "normalized" depth. If you want the actual depth (which I guess you do), you need to calculate the scale in every frame. This model will not help with that, because at that point you already have a dense depth map. You could recover the scale if you had some sparse depth points (from a low-resolution depth sensor) by performing a regression between the sparse absolute depth values and the "normalized" depth at those points.
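As a minimal sketch of that regression step (not part of this repository; the linear scale-plus-shift model and all names are assumptions), you could do a least-squares fit between the monocular prediction and the sparse measurements:

```python
import numpy as np

def fit_scale_shift(normalized_depth, sparse_depth):
    """Least-squares fit of: absolute ~= scale * normalized + shift.

    normalized_depth: HxW float array from a monocular depth model.
    sparse_depth:     HxW float array with 0 where no measurement exists.
    """
    mask = sparse_depth > 0                     # use only valid sparse points
    x = normalized_depth[mask].ravel()
    y = sparse_depth[mask].ravel()
    A = np.stack([x, np.ones_like(x)], axis=1)  # design matrix [x, 1]
    (scale, shift), *_ = np.linalg.lstsq(A, y, rcond=None)
    return scale, shift

# Usage: rescale the whole monocular prediction to absolute depth.
# scale, shift = fit_scale_shift(mono_depth, sparse_depth)
# metric_depth = scale * mono_depth + shift
```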

The way I see it, this model can produce the absolute dense depth map directly from the sparse absolute depth, without any regression, by filling in the sparse depth map. So it is more of a replacement for those monocular models.
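For reference, a rough sketch of what that inference path looks like with ONNX Runtime (the file names, preprocessing, and input layout below are assumptions for illustration; the repository's script defines the actual preprocessing):

```python
import cv2
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("msg_chn_wacv20.onnx")

rgb = cv2.imread("rgb.png")                                        # HxWx3 uint8
sparse = cv2.imread("sparse_depth.png", cv2.IMREAD_ANYDEPTH).astype(np.float32)

# Assumed preprocessing: NCHW float tensors.
rgb_in = rgb.astype(np.float32).transpose(2, 0, 1)[None]           # 1x3xHxW
depth_in = sparse[None, None]                                       # 1x1xHxW

# Match inputs by channel count so we don't rely on input order
# (assumes static channel dimensions in the model).
feeds = {}
for inp in session.get_inputs():
    feeds[inp.name] = rgb_in if inp.shape[1] == 3 else depth_in

dense_depth = session.run(None, feeds)[0].squeeze()                 # dense depth map
```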

I have also uploaded a TFLite example in case it helps: https://github.com/ibaiGorordo/TFLite-msg_chn_wacv20-depth-completion