Closed: hayj closed this issue 4 months ago
Strangely, I built a local environment in JupyterLab that is able to run the script, but I can't reproduce the installation commands to make it work in our Docker image. pip freeze gives this:
faiss==1.7.4
faiss-cpu==1.7.4
faiss-gpu==1.7.2
When I execute this demo script, it works:
https://github.com/facebookresearch/faiss/blob/main/tutorial/python/4-GPU.py
But when I set nq = 100000, it fails.
So it seems to fail with certain combinations of dim, dataset size, and batch size under Faiss version 1.7.3 (but not 1.7.4). Hence the fix is to reduce the batch size, i.e. to split the query vectors when searching, as in my main message above.
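The batch-split workaround can be sketched as follows. This is a minimal sketch, not code from the issue: batched_search is a hypothetical helper that works with any object exposing Faiss's search(x, k) signature, and the default batch size of 1024 is an arbitrary choice to tune.

```python
import numpy as np

def batched_search(index, queries, k, batch_size=1024):
    """Search `queries` in chunks of `batch_size` instead of one big batch.

    Issues one `index.search` call per chunk and concatenates the
    per-chunk distance and id arrays, so the result matches a single
    full-batch search while keeping each GPU batch small.
    """
    all_dist, all_ids = [], []
    for start in range(0, len(queries), batch_size):
        dist, ids = index.search(queries[start:start + batch_size], k)
        all_dist.append(dist)
        all_ids.append(ids)
    return np.concatenate(all_dist), np.concatenate(all_ids)
```

With a real Faiss index this would be called as batched_search(index, xq, k) in place of a direct index.search(xq, k).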
Note that in some cases it also fails when searching from multiprocessing worker processes, so in my code I replaced multiprocessing with threading.
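The multiprocessing-to-threading swap can be sketched like this. It is an illustrative sketch, not the issue author's code: parallel_search is a hypothetical helper, and the rationale in the comments (threads reuse the parent's already-initialized CUDA context, while forked worker processes cannot safely do so) is the usual explanation for this class of failure, asserted here as an assumption.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def parallel_search(index, query_chunks, k, workers=4):
    """Run one search per query chunk in threads rather than processes.

    Threads all run in the parent process and therefore share its CUDA
    context; forked processes inherit a copy of an initialized context,
    which CUDA does not support and which can surface as CUDA errors.
    `pool.map` preserves chunk order, so results concatenate cleanly.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(lambda q: index.search(q, k), query_chunks))
    dists, ids = zip(*results)
    return np.concatenate(dists), np.concatenate(ids)
```

Since the Faiss Python bindings release the GIL during search, the threads can actually overlap, so this is not just a correctness fix but keeps some parallelism.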
I re-opened this issue in case it still needs to be fixed on newer versions.
Please install with Anaconda.
Summary
I get a CUDA error when searching a Faiss index whose vectors have too small a dimension.
Platform
Installed from: https://github.com/kyamagu/faiss-wheels/releases/download/v1.7.3/faiss_gpu-1.7.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl#sha256=1a9755132bb81dc3daecd16e0b5471ddf0246e555a889ea311813f867fdcca88
Running on: GPU
Interface: Python
Reproduction instructions
This code works with dim = 1024. I also tried with different indexes and different parameters (nlist, etc.), but it always fails at a certain vector size (and not when increasing the size).
When I try to install different versions of Faiss (nightly builds and older versions), I face incompatibility issues such as: