rpartsey / pointgoal-navigation


Settings for reproducing the visual odometry model performance #6

Open · Hwiyeon opened this issue 2 years ago

Hwiyeon commented 2 years ago

Hi, thank you for sharing your impressive work!

I'm trying to use your visual odometry model in the Habitat simulator (with the MP3D and Gibson environments), but I'm struggling to reproduce the model performance.

Here are my settings, based on your code:

- VO model input
- Agent, Sensor settings (a rough illustrative sketch follows below)
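For reference, a rough sketch of the kind of values I mean (the numbers below are hypothetical placeholders written out for this issue, not necessarily the exact values I used or the repo's confirmed defaults):

```python
# Hypothetical placeholder values, only to illustrate which settings are meant.
vo_model_input = {
    "rgb_pair": True,    # RGB observations at step t and t+1
    "depth_pair": True,  # depth observations at step t and t+1
    "action": True,      # discrete action taken between the two frames
}
agent_sensor_settings = {
    "forward_step_size_m": 0.25,      # placeholder
    "turn_angle_deg": 30,             # placeholder
    "rgb_resolution_hw": (360, 640),  # (height, width), placeholder
    "depth_resolution_hw": (360, 640),
}
```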

I tested the provided pretrained models no. 10 (the biggest model, trained on Gibson) and no. 11 (fine-tuned on MP3D) on the Gibson and MP3D validation datasets with the above settings. Unfortunately, I got a total translation MAE of 13 cm and a rotation MAE of 0.075 rad, which is far worse than the supplementary results of the paper. I measured these as single-step errors, not as error accumulated over the trajectory. The qualitative results of the estimated trajectories are also far worse than the results in the paper.
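To be concrete, this is roughly how I compute the per-step errors (a simplified sketch of my own measurement, not your evaluation script; `per_step_mae` and its argument layout are just names I chose here):

```python
import numpy as np

def per_step_mae(pred_trans, gt_trans, pred_rot, gt_rot):
    """Mean absolute single-step VO error.

    pred_trans, gt_trans: (N, 3) predicted / ground-truth egomotion per step, in meters
    pred_rot, gt_rot:     (N,)   predicted / ground-truth rotation per step, in radians
    """
    translation_mae = np.mean(np.abs(pred_trans - gt_trans))
    # wrap the angular difference into [-pi, pi] before taking the absolute value
    rot_diff = np.arctan2(np.sin(pred_rot - gt_rot), np.cos(pred_rot - gt_rot))
    rotation_mae = np.mean(np.abs(rot_diff))
    return translation_mae, rotation_mae
```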

Is there anything wrong with the above settings? Or are there any tips for using your pretrained models? Thank you in advance.

orange2xiaoyu commented 1 year ago

Hello, I want to run agent.py to evaluate the agent (navigation policy + visual odometry), but I can only find an agent.py in trajectory-sampling/agent/, and there is no main function in trajectory-sampling/agent/agent.py. Do you know how to run the evaluation? Thank you very much.