vandal-vpr / vg-transformers

Official Repository of "Learning Sequential Descriptors for Sequence-based Visual Place Recognition"
MIT License

Add PCA #16

Closed Steven-jiaqi closed 1 year ago

Steven-jiaqi commented 1 year ago

Hello, when I run

```
python main_scripts/evaluation.py --pca_outdim 4096 --resume model_path --img_shape 384 384 --trunc_te 8 --freeze_te 1 --arch cct384 --aggregation seqvlad --dataset_path /path/msls_reformat --seq_length 5
```

this error happens:

```
loading database...: 100%|███████████████████████████████████████| 500/500 [00:00<00:00, 902.17it/s]
loading queries...: 100%|███████████████████████████████████████| 368/368 [00:00<00:00, 1335.90it/s]
Finding positives and negatives...: 100%|█████████████████████| 8060/8060 [00:06<00:00, 1268.22it/s]
2023-03-23 15:00:51   Test set: < BaseDataset, ' #database: 13584; #queries: 7964 >
Loading database to compute PCA...: 100%|████████████████████| 15637/15637 [00:27<00:00, 577.81it/s]
2023-03-23 15:01:20   PCA dataset: < PCADataset, ' #database: 733048 >
Database sequence descriptors for PCA:   0%|          | 0/2048 [00:00<?, ?it/s]
2023-03-23 15:01:20
Traceback (most recent call last):
  File "main_scripts/evaluation.py", line 90, in <module>
    evaluation()
  File "main_scripts/evaluation.py", line 77, in evaluation
    pca = compute_pca(args, model, transform, full_features_dim)
  File "main_scripts/evaluation.py", line 35, in compute_pca
    for i, sequences in enumerate(tqdm(dl, ncols=100, desc="Database sequence descriptors for PCA: ")):
  File "/home/user/.conda/envs/vgtransformers/lib/python3.7/site-packages/tqdm/std.py", line 1195, in __iter__
    for obj in iterable:
  File "/home/user/.conda/envs/vgtransformers/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 517, in __next__
    data = self._next_data()
  File "/home/user/.conda/envs/vgtransformers/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 557, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/home/user/.conda/envs/vgtransformers/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/user/.conda/envs/vgtransformers/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/user/.conda/envs/vgtransformers/lib/python3.7/site-packages/torch/utils/data/dataset.py", line 330, in __getitem__
    return self.dataset[self.indices[idx]]
  File "/home/user/yjq/vg-transformers/tvg/datasets/dataset.py", line 385, in __getitem__
    img = torch.stack([self.base_transform(Image.open(path)) for path in self.db_paths[index].split(',')])
  File "/home/user/yjq/vg-transformers/tvg/datasets/dataset.py", line 385, in <listcomp>
    img = torch.stack([self.base_transform(Image.open(path)) for path in self.db_paths[index].split(',')])
  File "/home/user/.conda/envs/vgtransformers/lib/python3.7/site-packages/PIL/Image.py", line 2953, in open
    fp = builtins.open(filename, "rb")
FileNotFoundError: [Errno 2] No such file or directory: 'database/NB6QNsyGLNwJXxDwno9Zug/@0415291.14@5001459.84@NB6QNsyGLNwJXxDwno9Zug@263@M_yw5YYeNXpcZQrUJQLiFg@ottawa@20180909@.jpg'
```

This error looks very strange. Do you know why it happens? I'm sorry to bother you so often. I would also like to know how to train on only one city in MSLS and test on another city. Thank you very much!

ga1i13o commented 1 year ago

Hello, this error is my fault: I refactored the datasets to make them work with relative paths but forgot to apply the change to the PCA dataset. I have pushed a commit and it is fixed now.
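For reference, the gist of the fix is to join the dataset root with the stored relative paths before opening the frames, as the other datasets already do. The snippet below is only a minimal sketch of that idea, not the actual commit; the function name and arguments are hypothetical, so refer to tvg/datasets/dataset.py for the real implementation.

```python
import os

import torch
from PIL import Image
from torchvision import transforms


def load_sequence(dataset_folder, relative_paths_csv, base_transform=transforms.ToTensor()):
    """Stack the frames of one sequence into a (seq_len, C, H, W) tensor.

    `relative_paths_csv` is a comma-separated list of frame paths stored
    relative to the dataset root, e.g. 'database/<city>/<frame>.jpg,...'.
    Joining each path with `dataset_folder` avoids the FileNotFoundError above.
    """
    return torch.stack([
        base_transform(Image.open(os.path.join(dataset_folder, path)).convert('RGB'))
        for path in relative_paths_csv.split(',')
    ])
```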

To train on one city you can simply pass the arg --city, and it will load only that city. The problem at the moment is that this works only for the train dataset, whereas val and test are fixed. If you want to select a single city that belongs to the val or test splits, you can simply add cities=city in the val or test dataset declaration.
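For example, a single-city training run could be launched as 'python main_scripts/main_train.py --dataset_path /path/msls_reformat --city trondheim'; note that --city and --dataset_path come from this thread, while the entry-point name main_scripts/main_train.py and the city are only placeholders of mine, so substitute the repo's actual training script and the city you want, keeping your usual flags.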

If instead you want to validate or test on a city that normally belongs to the train split, you have to make a slightly dirty modification: in the val or test dataset declaration, pass split='train' and cities=city (see the sketch below). As soon as I have time I will figure out a cleaner way to allow selecting any city in any split.
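A minimal sketch of what such declarations could look like, assuming the dataset class printed in the log above (BaseDataset from tvg.datasets) and that it accepts dataset_folder, split, cities and seq_length keyword arguments; the exact import path and constructor signature may differ, so check tvg/datasets/dataset.py:

```python
from tvg.datasets import BaseDataset  # class name taken from the log above; import path may differ

# Evaluate on a single city that already belongs to the val/test splits.
val_ds = BaseDataset(dataset_folder='/path/msls_reformat', split='val',
                     cities='ottawa', seq_length=5)

# "Dirty" variant: evaluate on a city that normally lives in the train split,
# by forcing split='train' and restricting the dataset to that city.
test_on_train_city_ds = BaseDataset(dataset_folder='/path/msls_reformat', split='train',
                                    cities='trondheim', seq_length=5)
```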

Steven-jiaqi commented 1 year ago

Thank you so much for your prompt response and great work! I am particularly interested in the other results in Table II of the paper, such as:

- SeqNet [2], ResNet-50, 4096
- SeqNet [2], CCT224, 4096

Can you please provide the corresponding models or code? I'm sorry to bother you again.

ga1i13o commented 1 year ago

Hello, in the SeqNet papers the authors do the following:

Moreover, their pre-trained NetVLAD descriptors are obtained with a VGG-16 backbone. In our paper we wanted to provide comparisons with different backbones, since we use more modern ones in our method. Therefore, in order to provide these ablations, I had to:

The whole process was very cumbersome and I do not have clean code to share. There were also other problems: we could not train on the whole dataset or it would not converge, and indeed in their code the authors use only a small subset of MSLS.

In conclusion, sadly I cannot help you reproduce those results, and given the hassle and the poor results obtained in the end, I suggest you avoid trying.

Steven-jiaqi commented 1 year ago

Thank you very much for the reminder!