yuanpengtu opened this issue 2 years ago
I also have this question.
Thanks very much for your attention, and I am very sorry. Because authorization for these datasets must be requested from their authors, I do not think it would be proper to share the datasets online. The datasets can be accessed through the websites below. You can download the raw data and train with the corresponding backbones (the image backbones are the pre-trained models provided by torchvision, and the pre-trained doc2vec model is available from https://github.com/jhlau/doc2vec). Hope this helps.
Note that if a link is invalid, you can try contacting the original authors.
INRIA-Websearch: https://weiyc.github.io/assets/projects/tcyb.html
NUS-WIDE: https://lms.comp.nus.edu.sg/wp-content/uploads/2019/research/nuswide/NUS-WIDE.html
XmediaNet: http://59.108.48.34/tiki/XMediaNet/
The list for nus_wide_deep_doc2vec_data_42941: https://drive.google.com/file/d/15xm6SwwdoLhMPupwheWopb3c5ryCp06j/view?usp=sharing
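For reference, a rough sketch of the feature-extraction step described above. This is illustrative only, not the exact script that produced the released files: the choice of VGG19 and every file name below are placeholders, and the jhlau/doc2vec checkpoints may require the gensim version pinned in that repo.

```python
import torch
from PIL import Image
from torchvision import models, transforms
from gensim.models.doc2vec import Doc2Vec

# Image backbone: any torchvision pre-trained model works;
# VGG19 is used here only as an illustration.
backbone = models.vgg19(pretrained=True)
backbone.classifier = backbone.classifier[:-1]  # drop the last FC layer -> 4096-d features
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

with torch.no_grad():
    img = preprocess(Image.open("example.jpg").convert("RGB")).unsqueeze(0)
    img_feat = backbone(img)  # shape (1, 4096)

# Text features: the jhlau/doc2vec checkpoints are gensim models;
# "doc2vec.bin" is a placeholder for the downloaded model file.
d2v = Doc2Vec.load("doc2vec.bin")
txt_feat = d2v.infer_vector("an example text description".split())
```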
Best regards,
Hu, Peng
Could you please contact the original authors of the datasets about these problems? They could fix the invalid URLs. Thanks a lot.
Hi, it seems that the format of these datasets is not consistent with the original data, so could you please send me a copy of the processed data (including the nus_wide_deep_doc2vec_data_42941.h5py and xmedianet_deep_doc2vec_data.h5py)? Thanks!
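In case it helps to diagnose the format mismatch, here is a generic h5py sketch for inspecting such a file. The key names inside depend on whatever preprocessing script wrote the file, so they are discovered rather than hard-coded.

```python
import h5py

# List every top-level entry in the processed file along with its
# shape and dtype, to compare against the layout the code expects.
with h5py.File("nus_wide_deep_doc2vec_data_42941.h5py", "r") as f:
    for name in f:
        obj = f[name]
        if isinstance(obj, h5py.Dataset):
            print(name, obj.shape, obj.dtype)
        else:
            print(name, "(group)")
```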