[Open] alibugra opened this issue 3 years ago
ANN is only suitable for algorithms that output a final embedding, which is typically a vector. Wide&Deep is a deep learning algorithm whose final output is a score, which is a scalar.
I did use ANN (hnswlib) in `KnnEmbeddingApproximate`, and this is just a start. I plan to implement some graph embedding algorithms that will all generate a final embedding, and ANN may play an important role in them.
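To illustrate the distinction above, here is a minimal numpy sketch (all names are hypothetical, not from the library): an embedding model places users and items in one vector space, so top-n retrieval reduces to a nearest-neighbor search that ANN indexes like hnswlib can accelerate, whereas a scoring model like Wide&Deep maps each (user, item) pair straight to a scalar, so every candidate item must be pushed through the network.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, dim = 1000, 16

# Embedding model: user and items share one vector space, so ranking
# is a dot-product nearest-neighbor search -- the setting ANN handles.
item_emb = rng.normal(size=(n_items, dim))
user_emb = rng.normal(size=dim)
scores_emb = item_emb @ user_emb  # one matrix-vector product scores all items

# Scoring model in the style of Wide&Deep (stand-in forward pass,
# hypothetical): each (user, item) pair yields only a scalar score,
# so there is no item vector left to put into an ANN index.
def pairwise_score(user_feat, item_feat):
    return float(np.tanh(user_feat @ item_feat))

scores_net = np.array(
    [pairwise_score(user_emb, item_emb[i]) for i in range(n_items)]
)

top10_emb = np.argsort(scores_emb)[::-1][:10]
top10_net = np.argsort(scores_net)[::-1][:10]
```

The embedding path computes all scores in one matrix-vector product (and could be replaced by an ANN lookup); the scoring path has to evaluate the model once per item.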
`np.argpartition` is used to retrieve the top-n items among the whole item set based on the computed scores. You may wonder why we use `np.argpartition` instead of `np.argsort`. That's because partitioning has O(n) time complexity, whereas quicksort is O(n log n).
In the `recommend_user` method of the `WideDeep` class, `np.argpartition` and some other methods are used, but I do not see any nearest-neighbors library (such as Faiss or ScaNN).
Why wasn't any ANN (Approximate Nearest Neighbors) search used in the recommendation methods? Does it work more efficiently or faster this way?