hezhangsprinter / DCPDN

Densely Connected Pyramid Dehazing Network (CVPR'2018)

Hi, can you share the train/val/test raw images, preferably on BaiduYun? I cannot download from Google Drive in China. Thanks a lot. #7

Open · ghost opened this issue 6 years ago

hezhangsprinter commented 6 years ago

Sorry, a BaiduYun account is not easy to get without a Chinese phone number. But you can also generate the samples yourself using 'create_train.py' (please download NYU Depth V2 from http://horatio.cs.nyu.edu/mit/silberman/nyu_depth_v2/nyu_depth_v2_labeled.mat).
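
Roughly, the generation step reads the labeled .mat with h5py and synthesizes hazy images with the atmospheric scattering model I = J*t + A*(1 - t), where the transmission t = exp(-beta*d) comes from the depth map. The sketch below shows the idea; the beta/A ranges, HDF5 key names, and output layout here are illustrative assumptions, not the script's exact values, so check 'create_train.py' for the real ones.

```python
import os

import h5py
import numpy as np

# nyu_depth_v2_labeled.mat is a MATLAB v7.3 file, so h5py can read it.
# Read this way the arrays appear transposed relative to MATLAB:
# images -> (1449, 3, 640, 480) uint8, depths -> (1449, 640, 480) float.
f = h5py.File('nyu_depth_v2_labeled.mat', 'r')
images = f['images']
depths = f['depths']

os.makedirs('train', exist_ok=True)
rng = np.random.default_rng(0)

for i in range(images.shape[0]):
    # Rearrange to (H, W, C) / (H, W) and scale the RGB image to [0, 1].
    J = np.transpose(images[i], (2, 1, 0)).astype(np.float32) / 255.0
    d = np.transpose(depths[i], (1, 0)).astype(np.float32)

    beta = rng.uniform(0.5, 1.5)   # scattering coefficient (assumed range)
    A = rng.uniform(0.7, 1.0)      # global atmospheric light (assumed range)

    # Atmospheric scattering model: I = J*t + A*(1 - t), t = exp(-beta*d).
    t = np.exp(-beta * d)[..., None]
    I = J * t + A * (1.0 - t)

    # Store hazy input, clear target, transmission map and atmospheric light
    # (the key names here are assumptions, not the script's exact ones).
    with h5py.File(os.path.join('train', f'{i:04d}.h5'), 'w') as out:
        out.create_dataset('haze', data=I)
        out.create_dataset('gt', data=J)
        out.create_dataset('trans', data=t[..., 0])
        out.create_dataset('ato', data=np.float32(A))
```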

ghost commented 6 years ago

Yes, thanks for the quick reply!

3togo commented 6 years ago

I can't download from the link below: http://horatio.cs.nyu.edu/mit/silberman/nyu_depth_v2/nyu_depth_v2_labeled.mat Is there any alternative link available?

zhaoxin111 commented 6 years ago

I can't download from the link below: http://horatio.cs.nyu.edu/mit/silberman/nyu_depth_v2/nyu_depth_v2_labeled.mat Is there any alternative link available?

Use Xunlei (the Thunder download manager).

3togo commented 6 years ago

Will it be too big for BaiduYun?

zhaoxin111 commented 6 years ago

It's 2.8 GB.

noobgrow commented 5 years ago

Has anyone managed to download it? Could someone upload the training set to Baidu Cloud? My downloads keep getting interrupted and never finish; Xunlei doesn't work either.

shawnyuen commented 5 years ago

Has anyone managed to download it? Could someone upload the training set to Baidu Cloud? My downloads keep getting interrupted and never finish; Xunlei doesn't work either.

Download 'nyu_depth_v2_labeled.mat' via wget; a resumable-download sketch follows below. The pre-generated training set is very large (~80 GB), so generating the training and validation sets yourself with 'create_train.py' is faster.
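
If the connection keeps dropping, wget -c resumes a partial download. The same idea in Python, as a sketch (not part of this repo), assuming the server honours HTTP Range requests:

```python
import os

import requests

URL = 'http://horatio.cs.nyu.edu/mit/silberman/nyu_depth_v2/nyu_depth_v2_labeled.mat'
DEST = 'nyu_depth_v2_labeled.mat'

def download_with_resume(url, dest, chunk_size=1 << 20):
    # Resume from however many bytes are already on disk.
    start = os.path.getsize(dest) if os.path.exists(dest) else 0
    headers = {'Range': f'bytes={start}-'} if start else {}
    with requests.get(url, headers=headers, stream=True, timeout=30) as r:
        r.raise_for_status()
        # 206 means the server honoured the Range header; otherwise restart.
        mode = 'ab' if start and r.status_code == 206 else 'wb'
        with open(dest, mode) as f:
            for chunk in r.iter_content(chunk_size=chunk_size):
                f.write(chunk)

# Re-run until the full ~2.8 GB file is on disk; each run picks up
# where the previous one stopped.
download_with_resume(URL, DEST)
```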

AdamJupiter commented 4 years ago

Has anybody run the code successfully? I am trying to run it, but I encounter a lot of errors. Can anybody help me? If you got this working, could you share your experience?