-
Probably good to figure #5 out first.
-
Instead of using my built-in cover-tree approximate nearest neighbor lib, new options have popped up:
| | [NMSLIB](https://github.com/nmslib/nmslib) | [HNSWLIB](https://github.com/nmslib/hnswlib) |…
-
I would like to point out that identifiers like “[`_CMD_OPTIONS_H_`](https://github.com/searchivarius/nmslib/blob/b0d05cac24b51cdfac498bac6a09460428fe09f3/similarity_search/include/cmd_options.h#L18 "…
-
Seeing as you're planning a future Julia wrapper, I thought I'd ask for an R one too, so I can try FALCONN instead of [RcppAnnoy](https://github.com/eddelbuettel/rcppannoy).
-
The readme states: "Despite being designed for the cosine similarity, FALCONN can often be used for nearest neighbor search under the Euclidean distance or a maximum inner product search."
So how wou…
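One standard reduction (not quoted from the FALCONN readme, just the usual argument): for unit-norm vectors, Euclidean distance is a monotone function of cosine similarity, since ||x − y||² = 2 − 2⟨x, y⟩. So centering and normalizing the data lets a cosine-based index answer Euclidean queries. A minimal numpy sketch of that equivalence (illustrative only, not FALCONN's API):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((1000, 64))

# Center, then normalize to unit length. For unit vectors,
# ||x - q||^2 = 2 - 2 * <x, q>, so ranking by Euclidean distance
# and ranking by inner product (cosine) coincide.
data -= data.mean(axis=0)
data /= np.linalg.norm(data, axis=1, keepdims=True)

q = rng.standard_normal(64)
q /= np.linalg.norm(q)  # in practice, shift q by the same mean first

nn_cosine = np.argmax(data @ q)                          # max inner product
nn_euclid = np.argmin(np.linalg.norm(data - q, axis=1))  # min distance
assert nn_cosine == nn_euclid
```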
-
I built a hash table with about 20,000 data points (feature dimension 256), but when using **find_k_nearest_neighbors** it can't find the right one;
and when using [get_unique_candidates], it may return…
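When an LSH index misses the true neighbor, the usual first diagnostic is to compute the exact answer by brute force and measure recall, then raise the number of probes or tables until recall is acceptable. A small numpy sketch of that check (the `approx` ids would come from the index; here they are a stand-in):

```python
import numpy as np

def exact_knn(data, query, k):
    """Exact k nearest neighbors by brute-force Euclidean distance."""
    dists = np.linalg.norm(data - query, axis=1)
    return np.argsort(dists)[:k]

def recall(approx_ids, exact_ids):
    """Fraction of the true k-NN that the approximate index returned."""
    return len(set(approx_ids) & set(exact_ids)) / len(exact_ids)

rng = np.random.default_rng(1)
data = rng.standard_normal((20000, 256)).astype(np.float32)
query = rng.standard_normal(256).astype(np.float32)

truth = exact_knn(data, query, k=10)
# In practice `approx` would be the ids returned by the LSH index;
# we reuse the exact answer here just to exercise the metric.
approx = truth
print(recall(approx, truth))  # 1.0 by construction
```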
-
Hi,
Thanks for sharing the code.
I am receiving the following error: No module named '_multiprobe'. Would you please help me with this issue?
Also, what do you mean by "KaHIP" in the prerequisi…
-
Points to cover:
- Linting: `clang-format --style=Google -i`
- Running unit tests (ideally also set up some continuous integration service, see #13)
- Pull requests
- What else?
-
I have run your code in python/benchmark/random_benchmark.py,
but if I run it in a high-dimensional setting,
such as with dimension $d=10000$,
I want to set params_cp.feature_hashing_dimension =…
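For context, feature hashing (the "hashing trick") maps a high-dimensional vector into a smaller one by adding each coordinate, with a random sign, into a randomly chosen bucket; the map is linear and approximately preserves inner products. A generic numpy sketch of what such a parameter controls (this is an illustrative implementation, not FALCONN's internals):

```python
import numpy as np

def feature_hash(x, m, seed=0):
    """Hash a d-dim vector into m buckets with random signs.

    Linear map: coordinate i is added with sign s[i] into bucket h[i].
    A fixed seed fixes the map, so all vectors share the same hashing.
    """
    d = x.shape[0]
    rng = np.random.default_rng(seed)
    h = rng.integers(0, m, size=d)        # target bucket per coordinate
    s = rng.choice([-1.0, 1.0], size=d)   # sign per coordinate
    out = np.zeros(m)
    np.add.at(out, h, s * x)              # unbuffered accumulation
    return out

rng = np.random.default_rng(42)
x = rng.standard_normal(10000)
y = rng.standard_normal(10000)
m = 1024

hx = feature_hash(x, m)
# The map is linear, so hashing commutes with vector addition.
assert np.allclose(feature_hash(x + y, m), hx + feature_hash(y, m))
```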
-
As far as I understand, the underlying algorithm doesn't require storing the dataset in memory, only the generated hash tables.
Given the favorable computational complexity, falconn is very interest…
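That memory profile can be sketched with a toy random-hyperplane LSH table that stores only (bucket, id) pairs; the vectors themselves could live on disk or in another process and be fetched only to re-rank candidates. (A real index, FALCONN included, typically still touches the raw vectors for distance computations; this is just an assumption-laden illustration of the storage split.)

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(7)
d, n, num_bits = 32, 5000, 12

planes = rng.standard_normal((num_bits, d))  # random hyperplanes

def bucket(v):
    """Pack the sign pattern of v against the hyperplanes into an int."""
    bits = (planes @ v) > 0
    return sum(int(b) << i for i, b in enumerate(bits))

data = rng.standard_normal((n, d))
table = defaultdict(list)
for i, v in enumerate(data):
    table[bucket(v)].append(i)  # store only the id, never the vector

# The table holds integer ids only; `data` itself could now be dropped
# from memory. A query identical to a stored point always lands in
# that point's bucket.
q = data[123]
candidates = table[bucket(q)]
```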