johny-c / pylmnn

Large Margin Nearest Neighbors implementation in python
BSD 3-Clause "New" or "Revised" License

Locality Sensitive Hashing / Approximate Nearest Neighbours Techniques #6

Open GeorgePearse opened 2 years ago

GeorgePearse commented 2 years ago

Does anyone know of any similar implementations that use something like Faiss to improve the performance of the nearest-neighbour step of the calculation? If not, is it something that would be reasonable to add to this library?

Something like

import faiss
import numpy as np


class FaissKNeighbors:
    def __init__(self, k=5):
        self.index = None
        self.y = None
        self.k = k

    def fit(self, X, y):
        # Flat (brute-force) L2 index over the training points.
        self.index = faiss.IndexFlatL2(X.shape[1])
        self.index.add(X.astype(np.float32))
        self.y = np.asarray(y)

    def predict(self, X):
        # Retrieve the k nearest training points for each query point.
        distances, indices = self.index.search(X.astype(np.float32), self.k)
        # Majority vote over the labels of the retrieved neighbours.
        votes = self.y[indices]
        predictions = np.array([np.argmax(np.bincount(v)) for v in votes])
        return predictions

The snippet above, adapted from https://gist.github.com/j-adamczyk/74ee808ffd53cd8545a49f185a908584#file-knn_with_faiss-py, provides a scikit-learn-like interface to Faiss and should almost be able to act as a drop-in replacement.
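
For instance, a rough usage sketch (toy data and sizes are purely illustrative, assuming the FaissKNeighbors class above is in scope):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Toy classification problem, just to exercise the fit/predict interface.
X, y = make_classification(n_samples=10_000, n_features=32, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = FaissKNeighbors(k=5)
clf.fit(X_train, y_train)
accuracy = np.mean(clf.predict(X_test) == y_test)
print(f"accuracy: {accuracy:.3f}")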

Since this leads to an approximate result, it should only be added as an option, e.g. nn_backend='faiss' or nn_backend='sklearn'. A rough sketch of what such a switch could look like follows.
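
None of the names below exist in pylmnn today; this is only a sketch of the idea of selecting the neighbour-search backend behind a common query interface (note that IndexFlatL2 is Faiss's brute-force index; approximate variants such as IndexIVFFlat would be the next step):

import numpy as np
from sklearn.neighbors import NearestNeighbors

def make_nn_index(X, n_neighbors, nn_backend='sklearn'):
    """Return a query(Q) -> (distances, indices) callable for the chosen backend."""
    if nn_backend == 'sklearn':
        nn = NearestNeighbors(n_neighbors=n_neighbors).fit(X)
        return lambda Q: nn.kneighbors(Q, return_distance=True)
    elif nn_backend == 'faiss':
        import faiss  # optional dependency, only needed for this backend
        index = faiss.IndexFlatL2(X.shape[1])
        index.add(np.ascontiguousarray(X, dtype=np.float32))
        return lambda Q: index.search(np.ascontiguousarray(Q, dtype=np.float32), n_neighbors)
    raise ValueError(f"unknown nn_backend: {nn_backend!r}")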

johny-c commented 2 years ago

This is interesting. Currently you can control the neighbor_params passed to the NearestNeighbors object, such as algorithm (e.g. 'kd_tree' or 'ball_tree') and n_jobs, which can parallelize things. Have you tried these out? It would be good to see how much you would gain with Faiss against a well-configured NearestNeighbors. My guess is this will make more of a difference for very large datasets.
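
A minimal sketch of that suggestion, tuning the existing scikit-learn backend before reaching for Faiss (the neighbor_params name follows the comment above; check the installed pylmnn signature for the exact spelling, and the dataset is a toy stand-in):

import numpy as np
from pylmnn import LargeMarginNearestNeighbor
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=5_000, n_features=32, random_state=0)

lmnn = LargeMarginNearestNeighbor(
    n_neighbors=3,
    max_iter=50,
    # Keys forwarded to sklearn's NearestNeighbors, per the comment above.
    neighbor_params={'algorithm': 'kd_tree', 'n_jobs': -1},
)
lmnn.fit(X, y)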

GeorgePearse commented 2 years ago

I hadn't carried out much experimentation with the built-in optimizations. It's not so relevant to my work at the minute, but I will try to find time to do so.