cvg / Hierarchical-Localization

Visual localization made easy with hloc
Apache License 2.0

Failed to operate Aachen pipeline. #266

Closed h3fx111 closed 10 months ago

h3fx111 commented 1 year ago

When I run the command `python -m hloc.pipelines.Aachen.pipeline --outputs ./outputs/aachen`, the program eventually prints "Loaded SuperPoint model" and hangs there, and I cannot terminate it with Ctrl+C. Why is that? Could you help me with it?

vestri commented 1 year ago

Hi, I have installed hloc and pycolmap on Windows 10 with conda-forge and Python 3.7 (same problem with Python 3.9), but I could not get the Aachen pipeline example to work. As for @h3fx111, it starts but crashes after "Loaded SuperPoint model".

I tracked the problem down: the module seems to be executed multiple times, which makes it crash. The log is below; I removed the configuration prints for clarity and added `logger.info('Start Aachen/Pipeline.py')` at the start of the module. This line is printed twice, which points to a process-spawning conflict.

Am I misusing the example, or is it a problem on Windows? Has anyone gotten it to work on Windows? Thanks

```
(hloc) PS G:\Hierarchical-Localization> python.exe -m hloc.pipelines.Aachen.pipeline
[2023/10/19 17:57:21 hloc INFO] Start Aachen/Pipeline.py
[2023/10/19 17:57:21 hloc INFO] start configurations for extraction and matching
[2023/10/19 17:57:21 hloc INFO] End retrieval_conf
[2023/10/19 17:57:21 hloc INFO] End feature_conf
[2023/10/19 17:57:21 hloc INFO] End matcher_conf
[2023/10/19 17:57:21 hloc INFO] Extracting local features with configuration:
{'model': {'max_keypoints': 4096, 'name': 'superpoint', 'nms_radius': 3},
 'output': 'feats-superpoint-n4096-r1024',
 'preprocessing': {'grayscale': True, 'resize_max': 1024}}
[2023/10/19 17:57:21 hloc INFO] start ImageDataset
[2023/10/19 17:57:21 hloc INFO] Found 2464 images in root datasets\aachen\images_upright.
Loaded SuperPoint model
[2023/10/19 17:57:21 hloc INFO] Start torch.utils.data.DataLoader
[2023/10/19 17:57:21 hloc INFO] End torch.utils.data.DataLoader
  0%| | 0/2464 [00:00<?, ?it/s]
[2023/10/19 17:57:24 hloc INFO] Start Aachen/Pipeline.py
[2023/10/19 17:57:24 hloc INFO] start configurations for extraction and matching
[2023/10/19 17:57:24 hloc INFO] End retrieval_conf
[2023/10/19 17:57:24 hloc INFO] End feature_conf
[2023/10/19 17:57:24 hloc INFO] End matcher_conf
[2023/10/19 17:57:24 hloc INFO] Extracting local features with configuration:
{'model': {'max_keypoints': 4096, 'name': 'superpoint', 'nms_radius': 3},
 'output': 'feats-superpoint-n4096-r1024',
 'preprocessing': {'grayscale': True, 'resize_max': 1024}}
[2023/10/19 17:57:24 hloc INFO] start ImageDataset
[2023/10/19 17:57:24 hloc INFO] Found 2464 images in root datasets\aachen\images_upright.
Loaded SuperPoint model
[2023/10/19 17:57:24 hloc INFO] Start torch.utils.data.DataLoader
[2023/10/19 17:57:24 hloc INFO] End torch.utils.data.DataLoader
  0%| | 0/2464 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\multiprocessing\spawn.py", line 125, in _main
    prepare(preparation_data)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\multiprocessing\spawn.py", line 234, in prepare
    _fixup_main_from_name(data['init_main_from_name'])
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\multiprocessing\spawn.py", line 258, in _fixup_main_from_name
    main_content = runpy.run_module(mod_name,
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\runpy.py", line 225, in run_module
    return _run_module_code(code, init_globals, run_name, mod_spec)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\runpy.py", line 97, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "G:\RDVision\Micado\Hierarchical-Localization\hloc\pipelines\Aachen\pipeline.py", line 48, in <module>
    features = extract_features.main(feature_conf, images, outputs)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "G:\RDVision\Micado\Hierarchical-Localization\hloc\extract_features.py", line 260, in main
    for idx, data in enumerate(tqdm(loader)):
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\tqdm\std.py", line 1182, in __iter__
    for obj in iterable:
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\utils\data\dataloader.py", line 438, in __iter__
    return self._get_iterator()
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\utils\data\dataloader.py", line 386, in _get_iterator
    return _MultiProcessingDataLoaderIter(self)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\utils\data\dataloader.py", line 1039, in __init__
    w.start()
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\multiprocessing\context.py", line 327, in _Popen
    return Popen(process_obj)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\multiprocessing\popen_spawn_win32.py", line 45, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\multiprocessing\spawn.py", line 154, in get_preparation_data
    _check_not_importing_main()
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\multiprocessing\spawn.py", line 134, in _check_not_importing_main
    raise RuntimeError('''
RuntimeError:
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

    This probably means that you are not using fork to start your
    child processes and you have forgotten to use the proper idiom
    in the main module:

        if __name__ == '__main__':
            freeze_support()
            ...

    The "freeze_support()" line can be omitted if the program
    is not going to be frozen to produce an executable.
```

sarlinpe commented 1 year ago

Thanks for reporting this. I don't have a Windows machine, so I never checked whether it works there. Windows is picky about multiprocessing: it starts child processes with "spawn" rather than "fork", which re-imports the main module. Two options to make it work (see the sketch after the list):

  1. Try out PR https://github.com/cvg/Hierarchical-Localization/pull/323
  2. Set num_workers=0 in https://github.com/cvg/Hierarchical-Localization/blob/8eb9977f1d2b0087bed4666ee83040049e921b10/hloc/extract_features.py#L256 and https://github.com/cvg/Hierarchical-Localization/blob/8eb9977f1d2b0087bed4666ee83040049e921b10/hloc/match_features.py#L230
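
For reference, a minimal sketch of the guard idiom that the RuntimeError asks for, applied to a pipeline script (the `main()` wrapper here is illustrative, not hloc's actual code):

```python
import multiprocessing

def main():
    # Run the pipeline steps here: extract_features.main(...),
    # match_features.main(...), etc.
    ...

if __name__ == "__main__":
    # On Windows, DataLoader workers are started with "spawn", which
    # re-imports this module; the guard keeps the child processes from
    # re-running the whole pipeline on import.
    multiprocessing.freeze_support()  # only needed for frozen executables
    main()
```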
vestri commented 1 year ago

Thanks @sarlinpe. Sorry, I had to wait to download the remaining GBs of the Aachen dataset that were missing. It worked but crashed later. I put num_workers=0 in the two calls as you suggested.

Here is the new log:

```
[2023/10/20 18:25:39 hloc INFO] Start Aachen/Pipeline.py
[2023/10/20 18:25:39 hloc INFO] start configurations for extraction and matching
[2023/10/20 18:25:39 hloc INFO] End retrieval_conf
[2023/10/20 18:25:39 hloc INFO] End feature_conf
[2023/10/20 18:25:39 hloc INFO] End matcher_conf
[2023/10/20 18:25:39 hloc INFO] Extracting local features with configuration:
{'model': {'max_keypoints': 4096, 'name': 'superpoint', 'nms_radius': 3},
 'output': 'feats-superpoint-n4096-r1024',
 'preprocessing': {'grayscale': True, 'resize_max': 1024}}
[2023/10/20 18:25:39 hloc INFO] start ImageDataset
[2023/10/20 18:25:39 hloc INFO] Found 2464 images in root datasets\aachen\images_upright.
[2023/10/20 18:25:42 hloc INFO] Skipping the extraction.
[2023/10/20 18:25:42 hloc INFO] End extract_features.main
[2023/10/20 18:25:42 hloc INFO] Found 13026 images and 13026 cameras in database.
[2023/10/20 18:25:42 hloc INFO] Reading the NVM model...
[2023/10/20 18:25:42 hloc INFO] Reading 4328 cameras...
[2023/10/20 18:25:42 hloc INFO] Reading 4328 images...
[2023/10/20 18:25:42 hloc INFO] Reading 1652687 points...
100%|██████████| 1652687/1652687 [00:40<00:00, 40368.54pts/s]
[2023/10/20 18:26:23 hloc INFO] Parsing image data...
[2023/10/20 18:26:47 hloc INFO] Writing the COLMAP model...
[2023/10/20 18:28:57 hloc INFO] Done.
[2023/10/20 18:28:59 hloc INFO] End colmap_from_nvm.main - wait
[2023/10/20 18:29:09 hloc INFO] Reading the COLMAP model...
[2023/10/20 18:29:54 hloc INFO] Extracting image pairs from covisibility info...
100%|██████████| 4328/4328 [01:03<00:00, 68.69it/s]
[2023/10/20 18:30:57 hloc INFO] Found 85515 pairs.
[2023/10/20 18:30:59 hloc INFO] End pairs_from_covisibility.main - wait
[2023/10/20 18:31:09 hloc INFO] Matching local features with configuration:
{'model': {'name': 'superglue', 'sinkhorn_iterations': 50, 'weights': 'outdoor'},
 'output': 'matches-superglue'}
Loaded SuperGlue model ("outdoor" weights)
  0%| | 0/56717 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "G:\RDVision\Micado\Hierarchical-Localization\hloc\pipelines\Aachen\pipeline.py", line 64, in <module>
    sfm_matches = match_features.main(
  File "G:\RDVision\Micado\Hierarchical-Localization\hloc\match_features.py", line 174, in main
    match_from_paths(conf, pairs, matches, features_q, features_ref, overwrite)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "G:\RDVision\Micado\Hierarchical-Localization\hloc\match_features.py", line 233, in match_from_paths
    for idx, data in enumerate(tqdm(loader, smoothing=.1)):
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\tqdm\std.py", line 1182, in __iter__
    for obj in iterable:
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\utils\data\dataloader.py", line 630, in __next__
    data = self._next_data()
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\utils\data\dataloader.py", line 674, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\utils\data\_utils\fetch.py", line 51, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\utils\data\_utils\fetch.py", line 51, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "G:\RDVision\Micado\Hierarchical-Localization\hloc\match_features.py", line 122, in __getitem__
    grp = fd[name0]
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\h5py\_hl\group.py", line 357, in __getitem__
    oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py\h5o.pyx", line 189, in h5py.h5o.open
KeyError: 'Unable to synchronously open object (component not found)'
```

sarlinpe commented 1 year ago

> [2023/10/20 18:25:39 hloc INFO] Found 2464 images in root datasets\aachen\images_upright.

This is incorrect; the dataset has 5426 images. Are you sure that you downloaded the images fully and successfully? On Unix I'd expect the following command to return 5426:

```
$ find datasets/aachen/images_upright -type f -name "*.jpg" | wc -l
```
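
Since the report comes from Windows, where `find` and `wc` are not available, a cross-platform Python equivalent (a sketch, not part of hloc) is:

```python
from pathlib import Path

# Count the .jpg files under the dataset root; per the comment above,
# this should print 5426 for a complete download.
n = sum(1 for _ in Path("datasets/aachen/images_upright").rglob("*.jpg"))
print(n)
```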
vestri commented 1 year ago

OK, thanks. I had only downloaded and decompressed aachen_v1_1.zip; this morning I added the other data and relaunched. It is not finished yet, but it got past this step and I now see the same number of images. The next step seems much longer; I will keep you updated.

```
[2023/10/23 15:27:27 hloc INFO] 5424 query/night/nexus5x/IMG_20161227_192344.jpg
100%|█████████▉| 5425/5426 [5:23:05<00:07, 7.18s/it]
[2023/10/23 15:27:36 hloc INFO] 5425 query/night/nexus5x/IMG_20161227_192354.jpg
100%|██████████| 5426/5426 [5:23:14<00:00, 3.57s/it]
[2023/10/23 15:27:46 hloc INFO] Finished exporting features.
[2023/10/23 15:27:46 hloc INFO] End extract_features.main
[2023/10/23 15:27:46 hloc INFO] Found 13026 images and 13026 cameras in database.
[2023/10/23 15:27:46 hloc INFO] Reading the NVM model...
[2023/10/23 15:27:46 hloc INFO] Reading 4328 cameras...
[2023/10/23 15:27:46 hloc INFO] Reading 4328 images...
[2023/10/23 15:27:46 hloc INFO] Reading 1652687 points...
100%|██████████| 1652687/1652687 [01:44<00:00, 15761.16pts/s]
[2023/10/23 15:29:31 hloc INFO] Parsing image data...
[2023/10/23 15:30:27 hloc INFO] Writing the COLMAP model...
[2023/10/23 15:34:19 hloc INFO] Done.
[2023/10/23 15:34:22 hloc INFO] End colmap_from_nvm.main - wait
[2023/10/23 15:34:32 hloc INFO] Reading the COLMAP model...
[2023/10/23 15:35:59 hloc INFO] Extracting image pairs from covisibility info...
100%|██████████| 4328/4328 [01:38<00:00, 43.91it/s]
[2023/10/23 15:37:38 hloc INFO] Found 85515 pairs.
[2023/10/23 15:37:40 hloc INFO] End pairs_from_covisibility.main - wait
[2023/10/23 15:37:50 hloc INFO] Matching local features with configuration:
{'model': {'name': 'superglue', 'sinkhorn_iterations': 50, 'weights': 'outdoor'},
 'output': 'matches-superglue'}
Loaded SuperGlue model ("outdoor" weights)
  0%|▎ | 166/56717 [1:16:05<405:14:23, 25.80s/it]
```

sarlinpe commented 1 year ago

25.80s/it is extremely slow: on a decent GPU you should expect around 20 FPS. Are you running this on CPU?

vestri commented 1 year ago

Yes, it was very slow. I installed torch with CUDA and now get between 1.2s/it and 1.5s/it, much better. Thanks

vestri commented 1 year ago

Hi, I am still on it but the process is not finished. I hit two other errors, which I worked around by adding a few lines of Python, explained below. It works, but I am not certain these are the correct solutions.

First error

```
[2023/10/25 09:35:46 hloc INFO] Skipping the matching.
[2023/10/25 09:35:46 hloc INFO] End match_features.main - wait
[2023/10/25 09:36:10 hloc WARNING] The database already exists, deleting it.
[2023/10/25 09:36:10 hloc INFO] Importing features into the database...
100%|██████████| 4328/4328 [00:09<00:00, 447.21it/s]
[2023/10/25 09:36:20 hloc INFO] Importing matches into the database...
100%|██████████| 85515/85515 [02:08<00:00, 667.00it/s]
[2023/10/25 09:38:29 hloc INFO] Performing geometric verification of the matches...
  0%| | 0/4328 [00:00<?, ?it/s]
OMP: Error #15: Initializing libiomp5md.dll, but found libiomp5md.dll already initialized.
OMP: Hint This means that multiple copies of the OpenMP runtime have been linked into
the program. That is dangerous, since it can degrade performance or cause incorrect
results. The best thing to do is to ensure that only a single OpenMP runtime is linked
into the process, e.g. by avoiding static linking of the OpenMP runtime in any library.
As an unsafe, unsupported, undocumented workaround you can set the environment variable
KMP_DUPLICATE_LIB_OK=TRUE to allow the program to continue to execute, but that may
cause crashes or silently produce incorrect results. For more information, please see
http://www.intel.com/software/products/support/.
```

The first error was an OpenMP crash because libiomp5md was initialized several times. From what I found online, this generally happens when OpenMP is linked statically. I worked around it by adding these two lines:

```python
import os
os.environ['KMP_DUPLICATE_LIB_OK'] = 'True'
```
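
Note that for this workaround to take effect, the variable must be set before any library that bundles its own OpenMP runtime is loaded; a sketch of the placement, assuming torch and pycolmap are the libraries in conflict:

```python
# Very first lines of the pipeline script, before importing torch,
# pycolmap, numpy, etc.
import os

# Per the OMP error hint: an unsafe but common workaround that lets
# multiple copies of libiomp5md.dll coexist in the process.
os.environ['KMP_DUPLICATE_LIB_OK'] = 'True'

import torch  # imports that link OpenMP come only after the env var is set
```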

Second error

```
Traceback (most recent call last):
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "G:\RDVision\Micado\Hierarchical-Localization\hloc\pipelines\Aachen\pipeline.py", line 79, in <module>
    global_descriptors = extract_features.main(retrieval_conf, images, outputs)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "G:\RDVision\Micado\Hierarchical-Localization\hloc\extract_features.py", line 254, in main
    model = Model(conf['model']).eval().to(device)
  File "G:\RDVision\Micado\Hierarchical-Localization\hloc\utils\base_model.py", line 17, in __init__
    self._init(conf)
  File "G:\RDVision\Micado\Hierarchical-Localization\hloc\extractors\netvlad.py", line 67, in _init
    torch.hub.download_url_to_file(url, checkpoint_path)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\site-packages\torch\hub.py", line 620, in download_url_to_file
    u = urlopen(req)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 214, in urlopen
    return opener.open(url, data, timeout)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 523, in open
    response = meth(req, response)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 632, in http_response
    response = self.parent.error(
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 555, in error
    result = self._call_chain(*args)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 494, in _call_chain
    result = func(*args)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 747, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 517, in open
    response = self._open(req, data)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 534, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 494, in _call_chain
    result = func(*args)
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 1389, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
  File "C:\Users\cvestri\miniconda3\envs\hloc\lib\urllib\request.py", line 1349, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1129)>
```

The second error is an SSL error. I tried replacing https with http in the code, but that did not work. I worked around it by adding these two lines of code:

```python
import ssl
ssl._create_default_https_context = ssl._create_unverified_context
```
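
Since this patch disables certificate verification for every HTTPS request in the process, a slightly safer hedged sketch restores the default once the checkpoint download has gone through:

```python
import ssl

# WARNING: disables HTTPS certificate verification process-wide. Keep it
# active only long enough for torch.hub to fetch the NetVLAD checkpoint.
_default_context = ssl._create_default_https_context
ssl._create_default_https_context = ssl._create_unverified_context
try:
    # ... run the failing download here, e.g. the extract_features.main call ...
    pass
finally:
    ssl._create_default_https_context = _default_context  # restore verification
```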

Current status

```
[2023/10/25 19:19:10 hloc INFO] 7887 sequences/nexus4_sequences/sequence_8/aachen_nexus4_seq8_0069.png
100%|█████████▉| 7888/7890 [2:07:46<00:01, 1.17it/s]
[2023/10/25 19:19:11 hloc INFO] 7888 sequences/nexus4_sequences/sequence_8/aachen_nexus4_seq8_0070.png
100%|█████████▉| 7889/7890 [2:07:47<00:00, 1.18it/s]
[2023/10/25 19:19:12 hloc INFO] 7889 sequences/nexus4_sequences/sequence_8/aachen_nexus4_seq8_0071.png
100%|██████████| 7890/7890 [2:07:48<00:00, 1.03it/s]
[2023/10/25 19:19:12 hloc INFO] Finished exporting features.
[2023/10/25 19:19:13 hloc INFO] Extracting image pairs from a retrieval database.
[2023/10/25 19:19:31 hloc INFO] Found 52000 pairs.
[2023/10/25 19:19:31 hloc INFO] Matching local features with configuration:
{'model': {'name': 'superglue', 'sinkhorn_iterations': 50, 'weights': 'outdoor'},
 'output': 'matches-superglue'}
Loaded SuperGlue model ("outdoor" weights)
 25%|██▌ | 13033/52000 [14:21:04<44:49:45, 4.14s/it]
```

It is again very slow. I use torch with CUDA; is there some other CUDA-accelerated library I could use to speed this up? I am using pycolmap from conda-forge (https://anaconda.org/conda-forge/pycolmap); I did not find any Windows build with CUDA. Thanks

sarlinpe commented 1 year ago

You can increase the speed at the cost of some accuracy with the following changes (a config sketch follows the list):

  1. Match with superglue-fast or superpoint+lightglue: https://github.com/cvg/Hierarchical-Localization/blob/6fdeacb3f83f41a196a1eb1f8673939646c6f8e4/hloc/pipelines/Aachen/pipeline.py#L29
  2. Use fewer keypoints by changing max_keypoints to 1024: https://github.com/cvg/Hierarchical-Localization/blob/8eb9977f1d2b0087bed4666ee83040049e921b10/hloc/extract_features.py#L29-L40
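
A hedged sketch of those two changes against the pipeline's configs; the config keys below (`superglue-fast`, `superpoint_aachen`) are as they appear in recent hloc versions, so check the confs dicts in your checkout:

```python
from hloc import extract_features, match_features

# 1. Faster matcher: 'superglue-fast' uses fewer Sinkhorn iterations;
#    a LightGlue-based conf may also be available in newer versions.
matcher_conf = match_features.confs["superglue-fast"]

# 2. Fewer keypoints: override the SuperPoint config in place.
feature_conf = extract_features.confs["superpoint_aachen"]
feature_conf["model"]["max_keypoints"] = 1024
```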
vestri commented 1 year ago

OK, thanks. I used your parameters for acceleration, and the pipeline finished :-)

```
[2023/10/27 10:27:50 hloc INFO] Localized 1015 / 1015 images.
[2023/10/27 10:27:50 hloc INFO] Writing poses to outputs\aachen\Aachen_hloc_superpoint+superglue_netvlad50.txt...
[2023/10/27 10:27:50 hloc INFO] Writing logs to outputs\aachen\Aachen_hloc_superpoint+superglue_netvlad50.txt_logs.pkl...
[2023/10/27 10:28:16 hloc INFO] Done!
```

I then tried to visualize the results in 3D; it does not seem easy once everything is computed, and I don't know how to use hloc/visualization.py.

Anyway, thanks for your help and the nice work. I am also wondering whether you are still working on colam (https://kaikai23.github.io/3dv.html) and whether the code will be available.

JiajieLi7012 commented 4 months ago

> Thanks @sarlinpe. Sorry, I had to wait to download the remaining GBs of the Aachen dataset that were missing. It worked but crashed later. I put num_workers=0 in the two calls as you suggested. [...]
>
> KeyError: 'Unable to synchronously open object (component not found)'

Hi, I'm trying to use hloc on my custom data but also got the error `KeyError: 'Unable to synchronously open object (component not found)'`. Could you please tell me how you solved this problem? Thanks in advance.