In this repository, we provide easy-to-use interfaces for several existing SotA methods to match image feature correspondences between image pairs. We also provide scripts to evaluate their predicted correspondences on common benchmarks for the tasks of image matching, homography estimation, and visual localization.
Comments from QJ: I am currently quite busy with my study and work, so it will take some time before I release the next two TODOs.
### Sparse Keypoint-based Matching
### Semi-dense Matching
### Supported Evaluations
The repository is structured as follows:
To use a specific method to perform the matching task, you simply need to do:
```python
import immatch
import yaml
from immatch.utils import plot_matches

# Initialize the model from its config file
with open('configs/patch2pix.yml', 'r') as f:
    args = yaml.load(f, Loader=yaml.FullLoader)['example']
model = immatch.__dict__[args['class']](args)
matcher = lambda im1, im2: model.match_pairs(im1, im2)

# Specify the image pair
im1 = 'third_party/patch2pix/examples/images/pair_2/1.jpg'
im2 = 'third_party/patch2pix/examples/images/pair_2/2.jpg'

# Match and visualize
matches, _, _, _ = matcher(im1, im2)
plot_matches(im1, im2, matches, radius=2, lines=True)
```
![example matches](docs/patch2pix_example_matches.png)
#### 👉 Try out the code using the [example notebook](notebooks/visualize_matches_on_example_pairs.ipynb).
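The predicted correspondences can also feed the evaluation tasks listed above, such as homography estimation. Below is a minimal sketch (not part of the toolbox) that continues from the example above, fitting a homography with OpenCV's RANSAC; it assumes `matches` is an N×4 array of `[x1, y1, x2, y2]` pixel correspondences:

```python
import cv2
import numpy as np

# Split the matches into corresponding point sets
# (assumption: each row of `matches` is [x1, y1, x2, y2]).
matches = np.asarray(matches, dtype=np.float32)
pts1, pts2 = matches[:, :2], matches[:, 2:4]

# Robustly fit a homography; needs at least 4 correspondences.
H, inlier_mask = cv2.findHomography(pts1, pts2, cv2.RANSAC,
                                    ransacReprojThreshold=3.0)
print(f'Inliers: {int(inlier_mask.sum())} / {len(matches)}')
```

The reprojection threshold (3 px here) is the main knob: a tighter value keeps only very precise matches, while a looser one tolerates noisier methods.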
## Notice
- This repository is expected to be actively maintained (at least before I graduate 🤣🤣) and to grow **gradually** (slowly) with new features of interest.
- Suggestions regarding how to improve this repo, such as adding new **SotA** image matching methods or new benchmark evaluations, are welcome 👏.
### Regarding Patch2Pix
With this repository, one can **reproduce** the tables reported in our paper accepted at CVPR 2021: Patch2Pix: Epipolar-Guided Pixel-Level Correspondences [[pdf]](https://arxiv.org/abs/2012.01909). Check [our patch2pix repository](https://github.com/GrumpyZhou/patch2pix) for its training code.
### Disclaimer
- All of the supported methods and evaluations are **not implemented from scratch** by us. Instead, we modularize their original code to define unified interfaces.
- If you are using the results of a method, **remember to cite the corresponding paper**.
- All credit for the implementation of those methods belongs to their authors.