GrumpyZhou / image-matching-toolbox

This is a toolbox repository to help evaluate various methods that match image features between a pair of images.
MIT License

A Toolbox for Image Feature Matching and Evaluations

In this repository, we provide easy interfaces for several existing SotA methods to match image feature correspondences between image pairs. We provide scripts to evaluate their predicted correspondences on common benchmarks for the tasks of image matching, homography estimation, and visual localization.

TODOs & Updates

Comments from QJ: Currently I am quite busy with my study & work, so it will take some time before I release the next two TODOs.

Supported Methods & Evaluations

Sparse Keypoint-based Matching:

Semi-dense Matching:

Supported Evaluations:

Repository Overview

The repository is structured as follows:

👉Refer to install.md for details about installation.

👉Refer to evaluation.md for details about evaluation on benchmarks.

Example Code for Quick Testing

To use a specific method to perform the matching task, you simply need to do:

```python
import yaml
import immatch
from immatch.utils import plot_matches

# Initialize model
with open('configs/patch2pix.yml', 'r') as f:
    args = yaml.load(f, Loader=yaml.FullLoader)['example']
model = immatch.__dict__[args['class']](args)
matcher = lambda im1, im2: model.match_pairs(im1, im2)

# Specify the image pair
im1 = 'third_party/patch2pix/examples/images/pair_2/1.jpg'
im2 = 'third_party/patch2pix/examples/images/pair_2/2.jpg'

# Match and visualize
matches, _, _, _ = matcher(im1, im2)
plot_matches(im1, im2, matches, radius=2, lines=True)
```


![example matches](docs/patch2pix_example_matches.png)

#### 👉 Try out the code using the [example notebook](notebooks/visualize_matches_on_example_pairs.ipynb).

## Notice
- This repository is expected to be actively maintained (at least before I graduate🤣🤣) and to **gradually** (slowly) grow new features of interest.
- Suggestions on how to improve this repo, such as adding new **SotA** image matching methods or new benchmark evaluations, are welcome 👏.

### Regarding Patch2Pix
With this repository, one can **reproduce** the tables reported in our paper accepted at CVPR 2021: Patch2Pix: Epipolar-Guided Pixel-Level Correspondences [[pdf]](https://arxiv.org/abs/2012.01909). Check [our patch2pix repository](https://github.com/GrumpyZhou/patch2pix) for its training code.

###  Disclaimer 
- None of the supported methods or evaluations was **implemented from scratch** by us. Instead, we modularize their original code to define unified interfaces.
- If you are using the results of a method, **remember to cite the corresponding paper**.
- All credit for the implementation of those methods belongs to their authors.