airlab-unibas / airlab

Image registration laboratory for 2D and 3D image data
Apache License 2.0

How to recover transformation matrix? #20

Closed abnan closed 4 years ago

abnan commented 4 years ago

Is there a way to recover the transformation matrix at the end of the registration process?

RobinSandkuehler commented 4 years ago

Hi, Yes. You can get the transformation matrix by calling

transformation_matrix = transformation.transformation_matrix

abnan commented 4 years ago

Is there a different convention for the matrix? For instance, if I apply the matrix to my fixed image with, say, OpenCV, it doesn't produce the same result. I was using the affine_registration_2d.py example from airlab and added this at the end:

plt.subplot(121)
M = transformation.transformation_matrix.detach().cpu().numpy()
moving = moving_image.numpy()
plt.imshow(cv2.warpAffine(moving, M, (moving.shape[1], moving.shape[0])), cmap='gray')
plt.subplot(122)
plt.imshow(warped_image.numpy(), cmap='gray')
plt.show()

The results look like this: [screenshot comparing the two warped images]

Can you suggest what needs to be done to have both of them produce the same output with the same matrix?

RobinSandkuehler commented 4 years ago

Hi,

You can try inverting the matrix and then using it with the OpenCV warping method.

bentyeh commented 4 years ago

import cv2
import matplotlib.pyplot as plt
import numpy as np
import skimage.transform

# AirLab's matrix operates on normalized [-1, 1] coordinates; append
# [0, 0, 1] to turn the 2x3 affine matrix into a 3x3 homogeneous matrix.
Tinv_airlab = np.vstack((transformation.transformation_matrix.detach().numpy(), [0, 0, 1]))

# Matrices mapping pixel coordinates into the normalized [-1, 1] range.
fixed_rescale = np.array([[2 / fixed_image.size[1], 0, -1],
                          [0, 2 / fixed_image.size[0], -1],
                          [0, 0, 1]])
moving_rescale = np.array([[2 / moving_image.size[1], 0, -1],
                           [0, 2 / moving_image.size[0], -1],
                           [0, 0, 1]])

# Change of basis back into pixel coordinates: Tinv is the inverse map
# (fixed-image pixels -> moving-image pixels); T is the forward transformation.
Tinv = np.linalg.inv(moving_rescale) @ Tinv_airlab @ fixed_rescale
T = np.linalg.inv(Tinv)

# scikit-image and Pillow ask for the inverse map
warped_sk_image = skimage.transform.warp(moving_image.numpy(), inverse_map=Tinv)
plt.imshow(warped_sk_image)

# OpenCV expects the forward transformation matrix
warped_cv2_image = cv2.warpAffine(moving_image.numpy(), T[0:2, :], (fixed_image.size[1], fixed_image.size[0]))
plt.imshow(warped_cv2_image)

I had the same question and spent some time digging through AirLab's source code to try to figure out how to recover the transformation matrix.

Simply inverting transformation.transformation_matrix is not enough. AirLab computes transformations in a coordinate system where pixel coordinates have been rescaled to the range [-1, 1], with (-1, -1) at the upper-left pixel and (1, 1) at the lower-right pixel. See the compute_grid() function: https://github.com/airlab-unibas/airlab/blob/80c9d487c012892c395d63c6d937a67303c321d1/airlab/transformation/utils.py#L22
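That rescaling can be written as a single homogeneous matrix, the same form used as fixed_rescale / moving_rescale in the snippet above. A minimal sketch (the image size here is an assumed example) showing that it sends the upper-left pixel to (-1, -1) and the lower-right corner to (1, 1):

```python
import numpy as np

# Example image size (assumed for illustration): height H, width W.
H, W = 100, 200

# Homogeneous matrix mapping pixel coordinates (x, y, 1) into [-1, 1].
rescale = np.array([[2 / W, 0, -1],
                    [0, 2 / H, -1],
                    [0, 0, 1]])

corner_ul = rescale @ np.array([0, 0, 1])  # upper-left pixel
corner_lr = rescale @ np.array([W, H, 1])  # lower-right corner
print(corner_ul)  # [-1. -1.  1.]
print(corner_lr)  # [1. 1. 1.]
```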

The motivation for this rescaling is to take advantage of torch.nn.functional.grid_sample for interpolation, which is analogous to the interpolation function scipy.ndimage.map_coordinates used by other Python image libraries like scikit-image.
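For reference, the analogy works like this: map_coordinates samples an image at arbitrary (row, col) coordinates, just as grid_sample does at normalized (x, y) coordinates. A small self-contained sketch (toy 4x4 image, values assumed for illustration):

```python
import numpy as np
from scipy.ndimage import map_coordinates

# Toy 4x4 image whose pixel values are just 0..15.
img = np.arange(16.0).reshape(4, 4)

# Sample along the main diagonal at integer pixel centers.
coords = np.array([[0, 1, 2, 3],   # row coordinates
                   [0, 1, 2, 3]])  # column coordinates
print(map_coordinates(img, coords, order=1))  # [ 0.  5. 10. 15.]
```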

Consequently, the transformation matrix output by AirLab needs to be scaled back into normal pixel coordinate space. Since GitHub does not support LaTeX in its markdown, I defer the brief mathematical treatment to the following Google Colab notebook: https://gist.github.com/bentyeh/77b5452315ea2591bf86b0c97b84f550