nikolamilosevic86 / verifAI

VerifAI is an initiative to build an open-source, easy-to-deploy generative question-answering engine that can reference and verify answers for correctness (using an a posteriori model)
https://verifai-project.com
GNU Affero General Public License v3.0

code not working properly #39

Open PrasannaVenkateshC opened 17 hours ago

PrasannaVenkateshC commented 17 hours ago

I am facing trouble while running this command:

pip install -r backend/requirements.txt

nikolamilosevic86 commented 16 hours ago

Can you write a bit more? What kind of trouble are you facing? What is the output?

PrasannaVenkateshC commented 16 hours ago

~/verifAI/backend$ pip install -r requirements.txt
Collecting fastapi (from -r requirements.txt (line 1))
  Using cached fastapi-0.115.5-py3-none-any.whl.metadata (27 kB)
Collecting async-fastapi-jwt-auth (from -r requirements.txt (line 2))
  Using cached async_fastapi_jwt_auth-0.6.6-py3-none-any.whl.metadata (3.7 kB)
Collecting fastapi-jwt-auth (from -r requirements.txt (line 3))
  Using cached fastapi_jwt_auth-0.5.0-py3-none-any.whl.metadata (3.5 kB)
Collecting attrs (from -r requirements.txt (line 4))
  Using cached attrs-24.2.0-py3-none-any.whl.metadata (11 kB)
Collecting bcrypt (from -r requirements.txt (line 5))
  Using cached bcrypt-4.2.1-cp39-abi3-manylinux_2_28_x86_64.whl.metadata (9.8 kB)
Collecting cmake (from -r requirements.txt (line 6))
  Downloading cmake-3.31.1-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.5 kB)
Collecting uvicorn (from -r requirements.txt (line 7))
  Using cached uvicorn-0.32.1-py3-none-any.whl.metadata (6.6 kB)
Collecting fastapi-cli==0.0.4 (from -r requirements.txt (line 8))
  Using cached fastapi_cli-0.0.4-py3-none-any.whl.metadata (7.0 kB)
Collecting h11 (from -r requirements.txt (line 9))
  Using cached h11-0.14.0-py3-none-any.whl.metadata (8.2 kB)
Collecting huggingface-hub (from -r requirements.txt (line 10))
  Using cached huggingface_hub-0.26.2-py3-none-any.whl.metadata (13 kB)
Collecting iniconfig (from -r requirements.txt (line 11))
  Using cached iniconfig-2.0.0-py3-none-any.whl.metadata (2.6 kB)
Collecting lxml (from -r requirements.txt (line 12))
  Using cached lxml-5.3.0-cp312-cp312-manylinux_2_28_x86_64.whl.metadata (3.8 kB)
Collecting numpy (from -r requirements.txt (line 13))
  Using cached numpy-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (62 kB)
Collecting openai (from -r requirements.txt (line 14))
  Using cached openai-1.55.0-py3-none-any.whl.metadata (24 kB)
Collecting pandas (from -r requirements.txt (line 15))
  Using cached pandas-2.2.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (89 kB)
Collecting pubmed-parser (from -r requirements.txt (line 16))
  Using cached pubmed_parser-0.5.1-py3-none-any.whl.metadata (17 kB)
Collecting PyJWT (from -r requirements.txt (line 17))
  Using cached PyJWT-2.10.0-py3-none-any.whl.metadata (4.0 kB)
Collecting python-dotenv==1.0.1 (from -r requirements.txt (line 18))
  Using cached python_dotenv-1.0.1-py3-none-any.whl.metadata (23 kB)
Collecting python-multipart (from -r requirements.txt (line 19))
  Using cached python_multipart-0.0.17-py3-none-any.whl.metadata (1.8 kB)
Collecting PyYAML (from -r requirements.txt (line 20))
  Using cached PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)
Collecting regex (from -r requirements.txt (line 21))
  Using cached regex-2024.11.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (40 kB)
Collecting safetensors (from -r requirements.txt (line 22))
  Using cached safetensors-0.4.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.8 kB)
Collecting scipy (from -r requirements.txt (line 23))
  Using cached scipy-1.14.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (60 kB)
Collecting sentencepiece (from -r requirements.txt (line 24))
  Using cached sentencepiece-0.2.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (7.7 kB)
Collecting spacy (from -r requirements.txt (line 25))
  Using cached spacy-3.8.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (27 kB)
Collecting tiktoken (from -r requirements.txt (line 26))
  Using cached tiktoken-0.8.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.6 kB)
Collecting tokenizers (from -r requirements.txt (line 27))
  Using cached tokenizers-0.20.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.7 kB)
Collecting torch (from -r requirements.txt (line 28))
  Using cached torch-2.5.1-cp312-cp312-manylinux1_x86_64.whl.metadata (28 kB)
Collecting tqdm (from -r requirements.txt (line 29))
  Downloading tqdm-4.67.1-py3-none-any.whl.metadata (57 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.7/57.7 kB 816.7 kB/s eta 0:00:00
Collecting transformers (from -r requirements.txt (line 30))
  Using cached transformers-4.46.3-py3-none-any.whl.metadata (44 kB)
Collecting typer (from -r requirements.txt (line 31))
  Using cached typer-0.13.1-py3-none-any.whl.metadata (15 kB)
Collecting typing_extensions (from -r requirements.txt (line 32))
  Using cached typing_extensions-4.12.2-py3-none-any.whl.metadata (3.0 kB)
Collecting urllib3 (from -r requirements.txt (line 33))
  Using cached urllib3-2.2.3-py3-none-any.whl.metadata (6.5 kB)
Collecting yarl (from -r requirements.txt (line 34))
  Using cached yarl-1.18.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (67 kB)
Collecting zstandard (from -r requirements.txt (line 35))
  Using cached zstandard-0.23.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.0 kB)
Collecting python-docx (from -r requirements.txt (line 36))
  Using cached python_docx-1.1.2-py3-none-any.whl.metadata (2.0 kB)
Collecting python-pptx (from -r requirements.txt (line 37))
  Using cached python_pptx-1.0.2-py3-none-any.whl.metadata (2.5 kB)
Collecting PyPDF2 (from -r requirements.txt (line 38))
  Using cached pypdf2-3.0.1-py3-none-any.whl.metadata (6.8 kB)
Collecting opensearch-py (from -r requirements.txt (line 39))
  Using cached opensearch_py-2.7.1-py3-none-any.whl.metadata (6.9 kB)
Collecting qdrant-client (from -r requirements.txt (line 40))
  Using cached qdrant_client-1.12.1-py3-none-any.whl.metadata (10 kB)
Collecting sentence-transformers (from -r requirements.txt (line 41))
  Using cached sentence_transformers-3.3.1-py3-none-any.whl.metadata (10 kB)
Collecting peft (from -r requirements.txt (line 42))
  Using cached peft-0.13.2-py3-none-any.whl.metadata (13 kB)
Collecting nltk (from -r requirements.txt (line 43))
  Using cached nltk-3.9.1-py3-none-any.whl.metadata (2.9 kB)
Collecting asyncpg (from -r requirements.txt (line 44))
  Using cached asyncpg-0.30.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (5.0 kB)
Collecting bitsandbytes (from -r requirements.txt (line 45))
  Using cached bitsandbytes-0.44.1-py3-none-manylinux_2_24_x86_64.whl.metadata (3.5 kB)
Collecting psycopg2 (from -r requirements.txt (line 46))
  Using cached psycopg2-2.9.10.tar.gz (385 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [21 lines of output]
      running egg_info
      writing psycopg2.egg-info/PKG-INFO
      writing dependency_links to psycopg2.egg-info/dependency_links.txt
      writing top-level names to psycopg2.egg-info/top_level.txt

  Error: pg_config executable not found.

  pg_config is required to build psycopg2 from source.  Please add the directory
  containing pg_config to the $PATH or specify the full executable path with the
  option:

      python setup.py build_ext --pg-config /path/to/pg_config build ...

  or with the pg_config option in 'setup.cfg'.

  If you prefer to avoid building psycopg2 from source, please install the PyPI
  'psycopg2-binary' package instead.

  For further information please check the 'doc/src/install.rst' file (also at
  <https://www.psycopg.org/docs/install.html>).

  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

nikolamilosevic86 commented 15 hours ago

The error happens during the installation of one of the libraries (psycopg2). You can try some of the following:

  1. Install psycopg2-binary instead: The error indicates that pg_config is not found, which is required to build psycopg2 from source. To avoid this issue, you can install the pre-compiled binary version of the package, which does not require pg_config. Run the following command:
    pip install psycopg2-binary

    This will install a binary version of psycopg2, which is suitable for most development and testing purposes (see https://www.psycopg.org/docs/install.html, https://pypi.org/project/psycopg2-binary/2.9.6/, and https://stackoverflow.com/questions/5420789/how-to-install-psycopg2-with-pip-on-python/67925334).

  2. Modify requirements.txt: If you prefer psycopg2-binary over psycopg2, update your requirements.txt file to replace psycopg2 with psycopg2-binary (a short example sequence is sketched after this list).

  3. Ensure build dependencies are installed (if building from source is necessary): If you must build psycopg2 from source, make sure that all necessary build dependencies are available. Install the PostgreSQL development package, which includes pg_config. On Ubuntu, you can do this with:

     sudo apt-get install libpq-dev

     Also ensure that the Python development headers are installed:

     sudo apt-get install python3-dev

  4. Add pg_config to PATH (if necessary): If you have installed the PostgreSQL development package but still encounter issues, make sure that the directory containing pg_config is on your system's PATH. You can do this by adding the following line to your shell configuration file (e.g., .bashrc or .zshrc):

     export PATH=/usr/lib/postgresql/X.Y/bin/:$PATH

     Replace X.Y with your PostgreSQL version number. By following these steps, you should be able to resolve the installation error and install the required packages.
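
For reference, here is a minimal sketch of option 2, assuming psycopg2 appears on its own unpinned line in backend/requirements.txt (adjust the sed pattern if it is pinned to a version):

    cd ~/verifAI
    # swap the source build for the pre-compiled wheel in the requirements file
    sed -i 's/^psycopg2$/psycopg2-binary/' backend/requirements.txt
    pip install -r backend/requirements.txt
    # confirm the driver imports correctly
    python -c "import psycopg2; print(psycopg2.__version__)"

If you go the source-build route instead, running which pg_config should print a path after installing libpq-dev; if it prints nothing, the PATH export from step 4 is still needed.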