TRI-ML / packnet-sfm

TRI-ML Monocular Depth Estimation Repository
https://tri-ml.github.io/packnet-sfm/
MIT License

Two-stream ego-motion #37

Closed YJonmo closed 3 years ago

YJonmo commented 4 years ago

Hi there,

Thanks for this great work.

I am trying to apply the two-stream ego-motion network to map the internal part of the shoulder joint as recorded during arthroscopy. So far I have tried to use the pose and depth networks separately. The video frames from inside the joint have a serious lack of texture, so I first pretrained the monocular depth network (Godard 2017) on frames from high-texture environments and then trained it on the arthroscopic images, and I achieved some success in getting depth for the arthroscopic frames. With the pose network, however, I have had little success so far: I trained it in a supervised manner, where the coordinates of each frame are known with respect to the joint center (absolute pose), and it only works well when the same joint is being used.
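
To make clear what I mean by absolute versus relative pose, here is a rough sketch in my own notation (not code from this repo): given the known absolute camera-to-joint-center transforms of two frames, the relative pose that a two-stream network would predict is just their composition.

```python
import numpy as np

def relative_pose(T_t: np.ndarray, T_s: np.ndarray) -> np.ndarray:
    """Pose of the target frame I_t expressed in the source frame I_s,
    given the absolute 4x4 camera-to-joint-center transforms T_t and T_s."""
    return np.linalg.inv(T_s) @ T_t
```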

I was thinking of using the two-stream ego-motion network, which gives me the relative pose of the target image I_t with respect to the source image I_s. But I am guessing this may not work for my case, because in arthroscopic videos there are occasions when the whole frame is suddenly obscured by tissue while the camera is moving. In that case, I guess there will be no similarity between I_t and I_s (which is the previous frame?), and this could lead to huge drift over time. But then, reading your paper, I noticed that the similarity matching cost function (Eq. 3) tries to match I_t to the context images I_S (which I guess are all the available training images?). If that is the case, then for my application the pose network should be able to estimate the pose of the first frame right after the obscuring tissue is removed. Or is it otherwise, and after training the pose network only estimates the pose of I_t with respect to I_s?
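
To show how I currently read the objective, here is a minimal PyTorch-style sketch (my own simplification, not the actual packnet-sfm code; the function names, the plain L1 error, and the min over context frames are all my assumptions). My question is really about what the context set below contains during training:

```python
import torch
import torch.nn.functional as F

def photometric_loss(I_t, context_frames, depth_net, pose_net, warp):
    """Simplified view of the similarity matching cost (Eq. 3), as I understand it.

    I_t:            target frame
    context_frames: the source frames I_s that I_t is matched against --
                    just the adjacent frames, or a wider set? (my question)
    warp:           view-synthesis function warping I_s into the target view
    """
    D_t = depth_net(I_t)                       # predicted depth of the target frame
    losses = []
    for I_s in context_frames:
        T_ts = pose_net(I_t, I_s)              # relative pose of I_t w.r.t. I_s
        I_s_warped = warp(I_s, D_t, T_ts)      # synthesize the target view from I_s
        losses.append(F.l1_loss(I_s_warped, I_t))  # plain L1 here for brevity
    return torch.stack(losses).min()           # assuming a min over context frames
```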

Regards, Jacob