DeepGraphLearning / graphvite

GraphVite: A General and High-performance Graph Embedding System
https://graphvite.io
Apache License 2.0

Question about compiling graphvite from source: external/faiss/configure not found #73

Closed nxznm closed 3 years ago

nxznm commented 3 years ago

Sorry to bother you! I want to compile graphvite from source, and I followed the instructions given in the README. But I ran into a problem; here is the output when I ran cd build && cmake .. && make && cd -.

-- The CXX compiler identification is GNU 7.3.0
-- The CUDA compiler identification is NVIDIA 10.0.130
-- Check for working CXX compiler: /home/user/miniconda3/envs/graphvite/bin/x86_64-conda_cos6-linux-gnu-c++
-- Check for working CXX compiler: /home/user/miniconda3/envs/graphvite/bin/x86_64-conda_cos6-linux-gnu-c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working CUDA compiler: /usr/local/cuda/bin/nvcc
-- Check for working CUDA compiler: /usr/local/cuda/bin/nvcc -- works
-- Detecting CUDA compiler ABI info
-- Detecting CUDA compiler ABI info - done
-- Looking for C++ include pthread.h
-- Looking for C++ include pthread.h - found
-- Looking for pthread_create
-- Looking for pthread_create - not found
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE
-- Found CUDA: /usr/local/cuda (found version "10.0")
-- Found Glog: /home/user/miniconda3/envs/graphvite/include
-- Found glog (include: /home/user/miniconda3/envs/graphvite/include, library: /home/user/miniconda3/envs/graphvite/lib/libglog.so)
-- Found GFlags: /home/user/miniconda3/envs/graphvite/include
-- Found gflags (include: /home/user/miniconda3/envs/graphvite/include, library: /home/user/miniconda3/envs/graphvite/lib/libgflags.so)
-- Found PythonInterp: /home/user/miniconda3/envs/graphvite/bin/python (found version "3.7.9")
-- Found PythonLibs: /home/user/miniconda3/envs/graphvite/lib/libpython3.7m.so
-- Performing Test HAS_CPP14_FLAG
-- Performing Test HAS_CPP14_FLAG - Success
-- Autodetected CUDA architecture(s): 7.5
-- Configuring done
-- Generating done
-- Build files have been written to: /home/user/LGY/graphvite/build
Scanning dependencies of target faiss
[  9%] Creating directories for 'faiss'
[ 18%] Performing download step (git clone) for 'faiss'
Cloning into 'faiss'...
Already on 'master'
Your branch is up to date with 'origin/master'.
[ 27%] No patch step for 'faiss'
[ 36%] Skipping update step for 'faiss'
[ 45%] Performing configure step for 'faiss'
/bin/sh: 1: /home/user/LGY/graphvite/external/faiss/configure: not found
CMakeFiles/faiss.dir/build.make:100: recipe for target 'faiss/src/faiss-stamp/faiss-configure' failed
make[2]: [faiss/src/faiss-stamp/faiss-configure] Error 127
CMakeFiles/Makefile2:72: recipe for target 'CMakeFiles/faiss.dir/all' failed
make[1]: [CMakeFiles/faiss.dir/all] Error 2
Makefile:83: recipe for target 'all' failed
make: *** [all] Error 2

As you can see, the main problem is that external/faiss/configure is not found. I also noticed that there is no configure file in the current faiss repository, so I suspect that graphvite/CMakeLists.txt is not compatible with the current version of faiss (the version of faiss you built against is probably different from the current one). This is just my guess, and I don't know how to fix the problem. I look forward to your reply. Thanks!

KiddoZhu commented 3 years ago

You are right. The FAISS interface changes frequently.

A simple workaround is to pin FAISS to a fixed version; for example, v1.5.1 and v1.6.1 were both reported to work in #36. You can specify the FAISS version in CMakeLists.txt with GIT_TAG v1.6.1, as sketched below.
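For reference, here is a minimal sketch of what the pin could look like, assuming FAISS is fetched through CMake's ExternalProject module; the exact arguments in graphvite's CMakeLists.txt may differ, so only the GIT_TAG line is the essential change:

    include(ExternalProject)

    # Pin FAISS to a tagged release whose autotools build still ships a
    # ./configure script, instead of tracking the tip of master.
    ExternalProject_Add(faiss
        GIT_REPOSITORY    https://github.com/facebookresearch/faiss.git
        GIT_TAG           v1.6.1
        SOURCE_DIR        ${PROJECT_SOURCE_DIR}/external/faiss
        CONFIGURE_COMMAND <SOURCE_DIR>/configure
        BUILD_COMMAND     make -j
        INSTALL_COMMAND   ""
        BUILD_IN_SOURCE   1
    )

After changing the tag, you may need to clear the build directory and the previously downloaded copy in external/faiss/ so that the pinned tag is actually fetched and checked out.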

Or manually check out the corresponding git tag or commit in external/faiss/.
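For example, from the repository root (a sketch of the manual route, using the v1.6.1 tag mentioned above):

    cd external/faiss
    git fetch --tags       # make sure release tags are available locally
    git checkout v1.6.1    # or whichever tested tag/commit you prefer
    cd -
    cd build && cmake .. && make && cd -    # re-run the build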

You may also skip FAISS entirely with cmake -DNO_FAISS=True; note that this disables the visualization application.
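Concretely, the flag just goes into the same build commands from the README:

    cd build
    cmake -DNO_FAISS=True ..   # configure without the FAISS dependency
    make
    cd -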

nxznm commented 3 years ago

Thanks for your patience!