Using the homography equation p’ = Hp, I was able to use the manual correspondence points to create 2n linear equations (two per point pair), which can then be solved with np.linalg.lstsq.
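As a rough sketch of how such a system might be set up (the function name computeH and the exact row layout are my own; this assumes the standard rearrangement where the bottom-right entry of H is fixed to 1, leaving 8 unknowns):

```python
import numpy as np

def computeH(pts1, pts2):
    """Fit a homography H mapping pts1 -> pts2 by least squares.

    pts1, pts2: (n, 2) arrays of matching (x, y) points, n >= 4.
    """
    A, b = [], []
    for (x, y), (xp, yp) in zip(pts1, pts2):
        # With h33 fixed to 1, p' = Hp rearranges into two linear
        # equations per correspondence in the 8 unknowns of H.
        A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp])
        A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp])
        b.extend([xp, yp])
    h, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return np.append(h, 1).reshape(3, 3)
```

With n points this yields the 2n equations mentioned above, and the system is overdetermined (and more robust to noisy clicks) whenever n > 4.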
I could have forward-warped each pixel through p’ = Hp, but I decided to use inverse warping, like that of project 3, in order to avoid gaps in the pixels. In particular, I had to handle the possibility of negative coordinates by storing an offset alongside the actual bounding box calculated after the translation. The bounding box itself was calculated by warping the corners of the picture with H.
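A minimal sketch of this inverse-warping step (the helper name inverse_warp is hypothetical, and scipy's map_coordinates stands in for whatever interpolation routine was actually used):

```python
import numpy as np
from scipy.ndimage import map_coordinates

def inverse_warp(im, H):
    """Warp color image im by homography H using inverse warping.

    Returns the warped image and the (x, y) offset of its bounding
    box, which may extend into negative coordinates.
    """
    h, w = im.shape[:2]
    # Bounding box: forward-warp the four corners with H.
    corners = np.array([[0, 0, 1], [w, 0, 1], [0, h, 1], [w, h, 1]]).T
    wc = H @ corners
    wc = wc[:2] / wc[2]
    xmin, ymin = np.floor(wc.min(axis=1)).astype(int)
    xmax, ymax = np.ceil(wc.max(axis=1)).astype(int)

    # Sample every output pixel back into the source via H^-1,
    # so every output pixel gets a value and no gaps appear.
    xs, ys = np.meshgrid(np.arange(xmin, xmax), np.arange(ymin, ymax))
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    src = np.linalg.inv(H) @ pts
    src = src[:2] / src[2]
    out = np.zeros((ymax - ymin, xmax - xmin, im.shape[2]))
    for c in range(im.shape[2]):
        out[..., c] = map_coordinates(
            im[..., c], [src[1], src[0]], order=1, cval=0
        ).reshape(xs.shape)
    # The offset records where the box sits relative to the origin.
    return out, (xmin, ymin)
```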
As a result, here is the first image warped into the second image's perspective through the homography computed from the manually marked correspondence points:
The H matrices were calculated using the methods defined above, and here are the results (after cropping back to the original image dimensions):
I used the dist2 function to find the pairwise distance between every feature patch in image 1 and image 2. Since my image contained a lot of bricks and repetitive patterns, it wasn't good enough to just take the closest matches as the correspondence points. Instead, I also used Lowe's trick with the second-closest neighbor to filter the features down to those that were highly distinctive. In particular, I used a threshold of 0.27, chosen somewhat arbitrarily but mostly based on the Multi-Image Matching using Multi-Scale Oriented Patches paper, where it seemed outliers generally had a distance above 0.4 and correct matches had a distance below 0.1.
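A sketch of this matching step follows. Since the dist2 implementation isn't shown, a pairwise Euclidean distance stands in for it, and this assumes the 0.27 threshold applies to the 1-NN/2-NN distance ratio, which is the usual form of Lowe's trick:

```python
import numpy as np

def match_features(desc1, desc2, ratio_thresh=0.27):
    """Match feature descriptors using Lowe's ratio test.

    desc1: (n1, d) array, desc2: (n2, d) array with n2 >= 2.
    Returns (i, j) index pairs of distinctive matches.
    """
    # Pairwise Euclidean distances between all descriptor pairs,
    # standing in for the dist2 function mentioned above.
    d = np.sqrt(((desc1[:, None, :] - desc2[None, :, :]) ** 2).sum(axis=2))
    matches = []
    for i in range(d.shape[0]):
        nn = np.argsort(d[i])
        first, second = d[i, nn[0]], d[i, nn[1]]
        # Keep a match only if the nearest neighbor is much closer
        # than the second-nearest, which filters out the ambiguous
        # matches produced by bricks and other repetitive patterns.
        if first / second < ratio_thresh:
            matches.append((i, nn[0]))
    return matches
```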