ClementPinard / SfmLearner-Pytorch

Pytorch version of SfmLearner from Tinghui Zhou et al.
MIT License

Large Errors on Pose Prediction Network #134

Closed micat001 closed 2 years ago

micat001 commented 2 years ago

Hi - I'm trying to generate pose predictions using either of the provided models. Unfortunately, running test_pose.py from a clean install seems to generate much larger errors than you're seeing.

Regardless of the torch/numpy versions (either new ones from requirements.txt, or old versions forced per issue 23), I get huge errors:

|                | Sequence 09          | Sequence 10          |
|----------------|----------------------|----------------------|
| ATE Best       | 0.0659 (std. 0.0234) | 0.0496 (std. 0.0250) |
| ATE Checkpoint | 0.0411 (std. 0.215)  | 0.0274 (std. 0.0159) |
| RE Best        | 0.0258 (std. 0.0145) | 0.0244 (std. 0.0152) |
| RE Checkpoint  | 0.0187 (std. 0.0144) | 0.0186 (std. 0.0139) |

Is this reproducible by others? It seems like something may be incorrect in my odometry setup, as that's really the only part not cribbed directly from this repo. However, I can correctly reproduce the ground-truth trajectories for the two sequences, so the ground-truth odometry seems accurate.

micat001 commented 2 years ago

I mostly got this ironed out. It seemed to be related to the resize and image-reading utilities, since the original imports from `scipy.misc` are deprecated.

I replaced all imports of `scipy.misc.imresize` with `from skimage.transform import resize as imresize`, and all imports of `scipy.misc.imread` with `from imageio import imread`.
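One caveat with a direct swap: `skimage.transform.resize` returns a float array in [0, 1], whereas the old `scipy.misc.imresize` returned uint8 in [0, 255], so downstream code may need the output rescaled. A minimal compatibility shim (the wrapper name and rescaling choice are my own, not from the repo) could look like:

```python
import numpy as np
from imageio import imread  # replacement for scipy.misc.imread
from skimage.transform import resize


def imresize(img, size):
    """Hypothetical stand-in for scipy.misc.imresize(img, (h, w)).

    skimage's resize returns floats in [0, 1]; rescale back to uint8
    so code written against the old scipy behaviour keeps working.
    """
    out = resize(img, size, anti_aliasing=True)  # float in [0, 1]
    return (out * 255).astype(np.uint8)
```

Whether to rescale back to uint8 or keep floats depends on what the calling code expects, so it's worth checking each call site rather than swapping blindly.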

For Sequence 10:

| Sequence          | ATE                  | RE                   |
|-------------------|----------------------|----------------------|
| 10 (My repo copy) | 0.0148 (std. 0.0096) | 0.0042 (std. 0.0027) |
| 10 (Repo Results) | 0.0141 (std. 0.0115) | 0.0018 (std. 0.0011) |

There are still differences, but they are not so large.

For reference, I'm using Python 3.8 and Torch 1.11, but I was able to achieve equivalent results on Python 3.9. I'll go ahead and close this issue.

ClementPinard commented 2 years ago

Hi,

sorry for the lack of response. Glad you could sort it out. Care to do a PR to solve the problem?

Otherwise I'll try to rectify it myself in a few days.

Clément

micat001 commented 2 years ago

I've written up my changes in a PR here: #138