Closed: github-yizhang closed this issue 1 year ago
Hi @github-yizhang,
- I think there is a spelling mistake in ./doc/usage/dataset.md: when downloading the ScanNet dataset, the instruction "update the path in ./etc/datasets/image-net/{training-all,test}.yaml" should be corrected.
Thanks a lot for pointing that out. It will be fixed soon (cf. https://github.com/facebookresearch/silk/pull/46/commits/bb37262615e804f32b17a4a8b84d7072b41edc28).
- After setting up my Python environment, I tested it using "./bin/run_tests", but some errors are reported as [...]
That unit test covers the LoFTR version of the ScanNet dataset class. It will only work on the FAIR internal infrastructure (because of the hardcoded paths). Since we don't use that version to train or test SiLK, you can simply comment out or remove that test file (I might remove it in a following commit as well). Was that the only unit test that failed?
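If it helps, one way to skip a single failing test file is to move it out of the test runner's path before re-running the suite. The filename below is a guess, not the repo's actual path, so locate the real file first:

```shell
# The exact filename is an assumption -- find the LoFTR ScanNet test first:
find . -iname "*scannet*" -path "*test*"

# Rename the file so the runner no longer collects it
# (adjust the path to whatever the find above reports):
mv ./lib/datasets/loftr/test_scannet.py ./lib/datasets/loftr/test_scannet.py.skip

# Re-run the remaining unit tests:
./bin/run_tests
```

Renaming (rather than deleting) keeps the file around in case it is needed again, and most test runners only collect files matching a `test_*.py` pattern, so the `.skip` suffix is enough to exclude it.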
- I am trying to train SiLK on the ScanNet dataset, but some errors still happen, as follows:
For ScanNet only, there are two places (cf. here and below) that need to be uncommented to make it work. Thanks for pointing that out; I'll add it to the documentation as well.
Thank you for your reply; it solved all the issues I was facing.
Thanks for your wonderful work on keypoints. I am trying to follow your work, but some issues with ScanNet are bothering me.

1. I think there is a spelling mistake in ./doc/usage/dataset.md: when downloading the ScanNet dataset, the instruction "update the path in ./etc/datasets/image-net/{training-all,test}.yaml" should be corrected.
2. After setting up my Python environment, I tested it using "./bin/run_tests", but some errors are reported. I am wondering whether I need to do some pre-processing on ScanNet first; I downloaded ScanNetv2 and am using its raw data.
3. I am trying to train SiLK on the ScanNet dataset, but some errors still happen, as follows:

In conclusion, it seems the way I am using the ScanNet dataset is wrong. I would be extremely grateful if you could provide details of the steps between downloading ScanNet and training on it. Thank you again for your devotion to this outstanding work, and I look forward to your early reply.