JiaRenChang / PSMNet

Pyramid Stereo Matching Network (CVPR2018)
MIT License

The pretrained KITTI-2015 model #88

Open youmi-zym opened 6 years ago

youmi-zym commented 6 years ago

Hi, thanks for sharing your code. I have some questions that need your help. My environment: PyTorch 0.4.0, torchvision 0.2.0.

I just downloaded your pretrained parameters and cloned your code, then ran python submission.py --datapath [kitti_scene_flow_path]/testing/ --loadmodel pretrained_model_KITTI2015.tar without any fine-tuning. Next, I used disp_read.m and disp_write.m from the KITTI MATLAB devkit to rewrite the results. Finally, I submitted them to the KITTI website for evaluation. Below are the results:

Mine:
            D1-bg   D1-fg   D1-all
All / All   2.36    5.72    2.92
All / Est   2.36    5.72    2.92
Noc / All   2.19    5.42    2.72
Noc / Est   2.19    5.42    2.72

Yours:
            D1-bg   D1-fg   D1-all
All / All   1.86    4.62    2.32
All / Est   1.86    4.62    2.32
Noc / All   1.71    4.31    2.14
Noc / Est   1.71    4.31    2.14

(attached: error map image)

You can see there is a 0.6% difference between us. The result pictures look almost identical, but my error map is lighter in color than yours.

I also ran python submission.py --datapath [kitti_scene_flow_path]/training/ --loadmodel pretrained_model_KITTI2015.tar and then used the MATLAB code to evaluate the results, recording every image's disparity error in the file exp.txt. The average disparity error is 0.95%. I have attached one example below; its error map is pale in the same way as the testing result. I think some mistake must have been made somewhere, please check, thanks a lot! (attached: exp.txt, 000000_10)
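The "disparity error" discussed above is the KITTI D1 outlier rate: a ground-truth pixel counts as wrong when its absolute disparity error exceeds both tau(1) = 3 pixels and tau(2) = 5% of the true disparity. A minimal Python sketch of that rule (function and variable names here are illustrative, not taken from the PSMNet repo or the devkit):

```python
# Sketch of the KITTI "3px / 5%" D1 outlier metric on flat pixel lists.
# A GT pixel is an outlier if |gt - est| > tau[0] AND |gt - est| > tau[1]*gt.
# Pixels with gt <= 0 carry no ground truth and are skipped.

def disp_error(d_gt, d_est, tau=(3.0, 0.05)):
    """Return the fraction of valid GT pixels that are outliers."""
    valid = 0
    bad = 0
    for gt, est in zip(d_gt, d_est):
        if gt <= 0:               # 0 (or negative) marks missing ground truth
            continue
        valid += 1
        err = abs(gt - est)
        if err > tau[0] and err > tau[1] * gt:
            bad += 1
    return bad / valid if valid else 0.0

# One pixel off by 4px on a 30px disparity is an outlier; the invalid
# pixel (gt = 0) does not enter the denominator.
print(disp_error([30.0, 50.0, 0.0], [34.0, 50.5, 10.0]))  # 0.5
```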

IceTTTb commented 6 years ago

Hi @youmi-zym, it seems that something is wrong with your error map. Here are two of my results on the KITTI 2015 validation set. (attached: 1, 2)

youmi-zym commented 6 years ago

@IceTTTb Thanks for your advice. Actually, I use the MATLAB toolkit from the KITTI website, and below is the main part of the code I have rewritten.

for i = train_val_list
    % read estimated and ground-truth disparity maps plus the RGB image
    D_est = disp_read(fullfile(train_root, iids(i).name));
    D_gt  = disp_read(fullfile(gt_root, iids(i).name));
    rgb   = double(imread(fullfile(img_root, iids(i).name))) / 255.0;
    % 3px / 5% outlier rate and per-pixel error visualization
    d_err = disp_error(D_gt, D_est, tau);
    D_err = disp_error_image(D_gt, D_est, tau);
    D_est_color = disp_to_color(D_est);   % computed but unused below
    D_gt_color  = disp_to_color(D_gt);    % computed but unused below
    % stack image, jointly-scaled color disparities, and error map vertically
    save_path = fullfile(train_root, ['../error/', iids(i).name]);
    imwrite([rgb; disp_to_color([D_est; D_gt]); D_err], save_path);
end

As you can see, in disp_to_color([D_est;D_gt]) the maximum disparity used to scale the color map is computed from the ground-truth and estimated disparities jointly. The steps I have taken are:

  1. Download pretrained_model_KITTI2015.tar.
  2. Run submission.py to get the testing and training results.
  3. Use the MATLAB code from the KITTI website (disp_read.m -> disp_write.m) to rewrite the results.
  4. Use the MATLAB code from the KITTI website (demo.m, shown above) to get the disparity map; here is the result on one of the examples you gave. (attached: 000019_10)

I'm wondering whether there are errors in submission.py, in the environment I set up, or in some step I missed.
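One consequence of the joint scaling discussed above: disp_to_color([D_est;D_gt]) normalizes both maps by their shared maximum, so the same disparity value renders as a different color than it would if each map were scaled by its own maximum. A toy Python sketch of just the normalization step (the real devkit's disp_to_color then applies a specific piecewise colormap, omitted here; all names are illustrative):

```python
# Sketch: how the choice of normalization range changes rendered colors.
# Stacking [D_est; D_gt] before disp_to_color makes both maps share one
# scale, set by whichever map contains the largest disparity.

def normalize(disp, max_disp=None):
    """Scale disparities to [0, 1] by max_disp (default: the map's own max)."""
    if max_disp is None:
        max_disp = max(disp)
    return [d / max_disp for d in disp]

d_est = [10.0, 20.0, 40.0]
d_gt  = [10.0, 20.0, 80.0]   # ground truth contains a larger disparity

# Scaled alone, the 20px pixel in the estimate maps to 0.5...
print(normalize(d_est)[1])                       # 0.5
# ...scaled jointly with GT, the same pixel drops to 0.25 -> a paler color.
print(normalize(d_est, max(d_est + d_gt))[1])    # 0.25
```

This is why a mismatched normalization range between two evaluation pipelines can make one error map look "shallower" even when the underlying disparities are close.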

IceTTTb commented 6 years ago

@youmi-zym Hi, I guess there may be an error in the third step of your pipeline. I feed the results from submission.py to demo.m directly, and the error maps look right.

youmi-zym commented 6 years ago

@IceTTTb Do you mean you got a similar map to the one I showed above? Well, in case there is some mistake I forgot, I will list the code and environment here:

pytorch 0.4.0 torchvision 0.2.0 python 3.5

disp_read.m

function D = disp_read (filename)
% loads disparity map D from png file
% for details see readme.txt

I = imread(filename);
D = double(I)/256;
D(I==0) = -1;

disp_write.m

function disp_write (D,filename)
% saves disparity map D to png file
% for details see readme.txt

D = double(D);

I = D*256;
I(D==0) = 1;
I(I<0) = 0;
I(I>65535) = 0;
I = uint16(I);
imwrite(I,filename);

passion3394 commented 4 years ago

I have rewritten disp_to_color.m in Python; issues are welcome if anyone runs into problems, and I will try to make the script run faster.

https://github.com/passion3394/PSMNet_CPU/blob/master/disp_to_color.py