ais-lab / d2s

[RAL 2024] D2S: Representing sparse descriptors and 3D coordinates for camera relocalization
https://thpjp.github.io/d2s/
Apache License 2.0

Hello, I have some problems training on my own data #14

Closed luocha0107 closed 2 months ago

luocha0107 commented 2 months ago

I trained on the Cambridge dataset following the instructions in the README. Now I want to train on my own data so that I can query the pose of an image later, but I am a little confused about how to prepare the dataset. Could you give me some guidance?

Right now I have hundreds of images, along with COLMAP data for them: camera poses, intrinsics, a sparse point cloud, and the .db file. What do I need to do to meet the requirements of the pipeline so I can train and test?
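For reference, here is a minimal sketch of how I inspect the COLMAP model with pycolmap to confirm it contains the poses, intrinsics, and sparse points (the model path is a placeholder for my own sparse/0 directory):

```python
import pycolmap  # pip install pycolmap

# Placeholder path to a COLMAP sparse model directory.
rec = pycolmap.Reconstruction("mydata/powertower_0729/sparse/0")
print(rec.summary())  # counts of cameras, registered images, 3D points

# Print one registered image and its camera intrinsics as a sample.
for image_id, image in rec.images.items():
    cam = rec.cameras[image.camera_id]
    print(image.name, cam.params)
    break
```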

Thank you for your work and dedication; I'm looking forward to your reply.

thuanaislab commented 2 months ago

If your data is not big, could you share it with me? Then I will put together an implementation for custom data so it is easier for you to use. Please note that you need a list_test.txt that lists all test images.
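For example, a minimal sketch for generating list_test.txt; the image directory, extension filter, and the every-10th-frame split are just placeholders here, use whatever split fits your data:

```python
# Sketch: build list_test.txt by holding out every 10th image as a test frame.
import os

image_dir = "mydata/powertower_0729/images"  # placeholder path
names = sorted(f for f in os.listdir(image_dir)
               if f.lower().endswith((".jpg", ".jpeg", ".png")))

with open("list_test.txt", "w") as f:
    for name in names[::10]:  # assumed split; adjust as needed
        f.write(name + "\n")
```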

thuanaislab commented 2 months ago

I just committed an additional preprocessing script for custom data matching your description: processing/preprocess_custom_data.py. Could you please test it on your own data? I just implemented it and haven't tested it on any data yet :) but I think it should work. If you hit any errors you can't fix, please raise them here.

luocha0107 commented 2 months ago

> I just committed an additional preprocessing script for custom data matching your description: processing/preprocess_custom_data.py. Could you please test it on your own data? I just implemented it and haven't tested it on any data yet :) but I think it should work. If you hit any errors you can't fix, please raise them here.

OK, I was just about to send you a message. I'll test preprocess_custom_data.py right away.

luocha0107 commented 2 months ago

> I just committed an additional preprocessing script for custom data matching your description: processing/preprocess_custom_data.py. Could you please test it on your own data? I just implemented it and haven't tested it on any data yet :) but I think it should work. If you hit any errors you can't fix, please raise them here.

I'm hitting a missing-file error:

Traceback (most recent call last):
  File "/root/data_user/ysl/feat2map/processing/preprocess_custom_data.py", line 133, in <module>
    main(sys.argv)
  File "/root/data_user/ysl/feat2map/processing/preprocess_custom_data.py", line 129, in main
    preprocessing(args.hloc_out_dir, args.list_test, args.out_dir, args.dataset, args.scene)
  File "/root/data_user/ysl/feat2map/processing/preprocess_custom_data.py", line 40, in preprocessing
    features = h5py.File(osp.join(hloc_out_dir, "feats-superpoint-n4096-r1024.h5"), 'r')
  File "/root/miniconda3/envs/gs_model/lib/python3.9/site-packages/h5py/_hl/files.py", line 562, in __init__
    fid = make_fid(name, mode, userblock_size, fapl, fcpl, swmr=swmr)
  File "/root/miniconda3/envs/gs_model/lib/python3.9/site-packages/h5py/_hl/files.py", line 235, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 102, in h5py.h5f.open
FileNotFoundError: [Errno 2] Unable to synchronously open file (unable to open file: name = '../mydata/powertower_0729/feats-superpoint-n4096-r1024.h5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)

My data structure is as follows: [screenshot: Weixin Screenshot_20240821165019]

Do I need to run hloc to get this file, and can hloc be run on my data?

thuanaislab commented 2 months ago

Yeah, I forgot to mention that part. You need to run hloc to get those files before you can run processing/preprocess_custom_data.py.
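The missing feats-superpoint-n4096-r1024.h5 matches the filename written by hloc's superpoint_aachen feature configuration (4096 keypoints, images resized to max 1024 px), so a sketch along these lines should produce it; the paths below are placeholders for your own data:

```python
from pathlib import Path
from hloc import extract_features

# Placeholder paths: images to extract from, and the directory that will
# later be passed to preprocess_custom_data.py as hloc_out_dir.
images = Path("mydata/powertower_0729/images")
outputs = Path("mydata/powertower_0729")

# 'superpoint_aachen' writes feats-superpoint-n4096-r1024.h5 into outputs.
conf = extract_features.confs["superpoint_aachen"]
feature_path = extract_features.main(conf, images, outputs)
print("features written to", feature_path)
```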

luocha0107 commented 2 months ago

> Yeah, I forgot to mention that part. You need to run hloc to get those files before you can run processing/preprocess_custom_data.py.

OK, I see. Thank you so much, you have helped me a lot!