Closed: kaiyi98 closed this issue 2 years ago
In main_scripts/msls/1_reformat_mapillary.py, line 11, the import 'from tvg.utils import get_all_datasets_path' fails because I cannot find a definition of get_all_datasets_path anywhere.
Hello, thank you for your interest.
The .secret file should contain your credentials for the Robotcar website, formatted as: username,password
The credentials can be obtained for free from the Robotcar official website.
Regarding get_all_datasets_path: thank you for spotting the mistake. You simply need to manually set an origin folder (the original Mapillary dataset) and a destination folder (the formatted version). I just pushed changes to make this clearer in the code and to remove the non-existent import.
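As a hypothetical sketch of the manual replacement (the variable names and paths below are assumptions for illustration, not the pushed code):

```python
# Hypothetical replacement for the removed get_all_datasets_path import:
# set the two folders by hand before running 1_reformat_mapillary.py.
original_dir = "/path/to/mapillary_sls"         # assumption: the raw MSLS download
formatted_dir = "/path/to/mapillary_formatted"  # assumption: where the reformatted split goes
```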
Sorry, I can't find the .secret file on the Robotcar official website. Can you give me the link?
Hello, the Robotcar website is at https://robotcar-dataset.robots.ox.ac.uk/. Regarding the .secret file: it does not come from the website; it's just a quick-and-dirty solution I adopted to automate downloading. You register on the website, then manually create this file with the simple format username,password. The script will then read the file, extract the credentials, and start downloading.
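For illustration, parsing such a file could look like the following sketch (the function name is an assumption, not the repository's actual code):

```python
def read_credentials(path=".secret"):
    """Read 'username,password' from a one-line credentials file.

    Hypothetical sketch of how 1_downloader.py might parse .secret;
    the actual script may differ.
    """
    with open(path) as f:
        line = f.read().strip()
    # split only on the first comma, so commas in the password survive
    username, password = line.split(",", 1)
    return username, password
```

Splitting on the first comma only means a password that itself contains commas still round-trips correctly.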
Hello, I have some questions about the code in pooling.py. At line 247, 'outputs = encoder(inputs)' gives 'outputs' of shape [B, token_nums, D], followed by 'norm_outputs = F.normalize(outputs, p=2, dim=1)'. Why is dim=1 rather than dim=2? Also, in 'image_descriptors = norm_outputs.view(norm_outputs.shape[0], features_dim, -1).permute(0, 2, 1)', the view() call scrambles the token features: although image_descriptors.shape == outputs.shape, the result no longer corresponds to the original tokens.
Hello, I have some questions about 1_downloader.py. I have noticed that the download server of the Oxford Robotcar dataset is currently unavailable. Could you share another link (to the pre-processed Oxford Robotcar dataset)?
@kaiyi98: regarding your last comment: you are absolutely right, thanks for pointing it out! For ViT and CCT, the normalization should be applied to each token, rather than along the first dimension. I have run a couple of experiments and this fix provides a boost of +1.5/2 recall points. I will push a small fix for this.
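To illustrate why the axis matters for a [B, tokens, D] tensor, here is a small sketch using NumPy in place of PyTorch (the shapes are made up for the example):

```python
import numpy as np

def l2_normalize(x, axis):
    # divide by the L2 norm computed along the chosen axis
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

rng = np.random.default_rng(0)
outputs = rng.random((2, 4, 8))                 # [B, tokens, D]

across_tokens = l2_normalize(outputs, axis=1)   # mixes different tokens (the bug)
per_token = l2_normalize(outputs, axis=2)       # each token vector gets unit norm

print(np.allclose(np.linalg.norm(per_token, axis=2), 1.0))      # True
print(np.allclose(np.linalg.norm(across_tokens, axis=2), 1.0))  # False
```

Normalizing along axis=1 divides each feature channel by a norm taken across tokens, so no individual token descriptor ends up with unit length; only axis=2 gives unit-norm token vectors.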
@kaiyi98 and @wangzhuo2333 : since you both asked about the Robotcar Dataset. I am aware that the usage of all the pre-processing scripts is a bit tricky. I have done some searches and it seems that the license with which the dataset was released allows me to share it. I will upload the already processed split as we used them in our paper, and I will update the README with a link to it.
Thanks to you both for your interest!
@kaiyi98 and @wangzhuo2333: I have added links to directly download the pre-processed Robotcar splits in the Datasets section of the README.
Thank you so much! The Robotcar download server has been restored, @wangzhuo2333. But I have a new problem: the pre-trained model weights can't be loaded from the model_urls in cct.py.
Hello, I pushed some changes:
In main_scripts/msls/1_reformat_mapillary.py, line 11, the import 'from tvg.utils import get_all_datasets_path' fails because I cannot find a definition of get_all_datasets_path anywhere.
Sorry to bother you; I would like to know how this problem was solved. I have the same problem. Does 'DS_path' require us to assign a specific value?
What is the .secret file at line 45 of main_scripts/robotcar/1_downloader.py? Thank you!