Farzad-R / LLM-Zero-to-Hundred

This repository contains different LLM chatbot projects (RAG, LLM agents, etc.) and well-known techniques for training and fine-tuning LLMs.

ERROR: Could not find a version that satisfies the requirement bzip2==1.0.8 (from versions: none) ERROR: No matching distribution found for bzip2==1.0.8 #8

Closed DrivenIdeaLab closed 7 months ago

DrivenIdeaLab commented 7 months ago

Ubuntu (WSL) on Windows 11

```
(projectenv) user@Nathan:~/LLM-Zero-to-Hundred$ pip install -r requirements.txt
Collecting build==1.0.3 (from -r requirements.txt (line 19))
  Downloading build-1.0.3-py3-none-any.whl.metadata (4.2 kB)
ERROR: Could not find a version that satisfies the requirement bzip2==1.0.8=he774522_0 (from versions: none)
ERROR: No matching distribution found for bzip2==1.0.8=he774522_0
```

```
(projectenv) user@Nathan:~/LLM-Zero-to-Hundred$ conda install anaconda::bzip2
Channels:

## Package Plan ##

  environment location: /home/user/miniconda3/envs/projectenv

  added / updated specs:

The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    bzip2-1.0.8                |       h7b6447c_0         105 KB  anaconda
    ------------------------------------------------------------
                                           Total:         105 KB

The following packages will be SUPERSEDED by a higher-priority channel:

  bzip2   pkgs/main::bzip2-1.0.8-h5eee18b_5 --> anaconda::bzip2-1.0.8-h7b6447c_0

Proceed ([y]/n)? y

Downloading and Extracting Packages:

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
(projectenv) user@Nathan:~/LLM-Zero-to-Hundred$ pip install -r requirements.txt
Collecting aiofiles==23.2.1 (from -r requirements.txt (line 4))
  Using cached aiofiles-23.2.1-py3-none-any.whl.metadata (9.7 kB)
Collecting aiohttp==3.9.1 (from -r requirements.txt (line 5))
  Using cached aiohttp-3.9.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (7.4 kB)
Collecting aiosignal==1.3.1 (from -r requirements.txt (line 6))
  Using cached aiosignal-1.3.1-py3-none-any.whl.metadata (4.0 kB)
Collecting altair==5.2.0 (from -r requirements.txt (line 7))
  Using cached altair-5.2.0-py3-none-any.whl.metadata (8.7 kB)
Collecting annotated-types==0.6.0 (from -r requirements.txt (line 8))
  Using cached annotated_types-0.6.0-py3-none-any.whl.metadata (12 kB)
Collecting anyio==3.7.1 (from -r requirements.txt (line 9))
  Using cached anyio-3.7.1-py3-none-any.whl.metadata (4.7 kB)
Collecting asgiref==3.7.2 (from -r requirements.txt (line 10))
  Using cached asgiref-3.7.2-py3-none-any.whl.metadata (9.2 kB)
Collecting asyncer==0.0.2 (from -r requirements.txt (line 11))
  Using cached asyncer-0.0.2-py3-none-any.whl.metadata (6.8 kB)
Collecting attrs==23.2.0 (from -r requirements.txt (line 12))
  Using cached attrs-23.2.0-py3-none-any.whl.metadata (9.5 kB)
Collecting backoff==2.2.1 (from -r requirements.txt (line 13))
  Using cached backoff-2.2.1-py3-none-any.whl.metadata (14 kB)
Collecting bcrypt==4.1.2 (from -r requirements.txt (line 14))
  Using cached bcrypt-4.1.2-cp39-abi3-manylinux_2_28_x86_64.whl.metadata (9.5 kB)
Collecting beautifulsoup4==4.12.2 (from -r requirements.txt (line 15))
  Using cached beautifulsoup4-4.12.2-py3-none-any.whl.metadata (3.6 kB)
Collecting bidict==0.22.1 (from -r requirements.txt (line 16))
  Using cached bidict-0.22.1-py3-none-any.whl.metadata (10 kB)
Collecting blinker==1.7.0 (from -r requirements.txt (line 17))
  Using cached blinker-1.7.0-py3-none-any.whl.metadata (1.9 kB)
Collecting bs4==0.0.1 (from -r requirements.txt (line 18))
  Using cached bs4-0.0.1.tar.gz (1.1 kB)
  Preparing metadata (setup.py) ... done
Collecting build==1.0.3 (from -r requirements.txt (line 19))
  Using cached build-1.0.3-py3-none-any.whl.metadata (4.2 kB)
ERROR: Could not find a version that satisfies the requirement bzip2==1.0.8 (from versions: none)
ERROR: No matching distribution found for bzip2==1.0.8
```

Resolution:

```
(projectenv) user@Nathan:~/LLM-Zero-to-Hundred$ sudo apt-get install libbz2-dev
```

DrivenIdeaLab commented 7 months ago

Resolution:

```
(projectenv) user@Nathan:~/LLM-Zero-to-Hundred$ sudo apt-get install libbz2-dev
[sudo] password for user:
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
The following additional packages will be installed:
  bzip2-doc
The following NEW packages will be installed:
  bzip2-doc libbz2-dev
0 upgraded, 2 newly installed, 0 to remove and 39 not upgraded.
Need to get 532 kB of archives.
After this operation, 718 kB of additional disk space will be used.
Do you want to continue? [Y/n] y
Get:1 http://archive.ubuntu.com/ubuntu jammy/main amd64 bzip2-doc all 1.0.8-5build1 [500 kB]
Get:2 http://archive.ubuntu.com/ubuntu jammy/main amd64 libbz2-dev amd64 1.0.8-5build1 [32.5 kB]
Fetched 532 kB in 3s (212 kB/s)
Selecting previously unselected package bzip2-doc.
(Reading database ... 32919 files and directories currently installed.)
Preparing to unpack .../bzip2-doc_1.0.8-5build1_all.deb ...
Unpacking bzip2-doc (1.0.8-5build1) ...
Selecting previously unselected package libbz2-dev:amd64.
Preparing to unpack .../libbz2-dev_1.0.8-5build1_amd64.deb ...
Unpacking libbz2-dev:amd64 (1.0.8-5build1) ...
Setting up bzip2-doc (1.0.8-5build1) ...
Setting up libbz2-dev:amd64 (1.0.8-5build1) ...
Processing triggers for install-info (6.8-4build1) ...
(projectenv) user@Nathan:~/LLM-Zero-to-Hundred$
```

Farzad-R commented 7 months ago

Hello, I updated the requirements.txt files in WebRAGQuery and WebGPT projects and only included the necessary libraries. This should help you execute the project regardless of the OS. Please let me know if this solves the issue.

DrivenIdeaLab commented 7 months ago

> Hello, I updated the requirements.txt files in WebRAGQuery and WebGPT projects and only included the necessary libraries. This should help you execute the project regardless of the OS. Please let me know if this solves the issue.

Thanks Bud, Will clone now and give it a go

DrivenIdeaLab commented 7 months ago

> Hello, I updated the requirements.txt files in WebRAGQuery and WebGPT projects and only included the necessary libraries. This should help you execute the project regardless of the OS. Please let me know if this solves the issue.

First issue, main dir:

```
(projectenv) user@Nathan:~/zero-to-hundred$ cd LLM-Zero-to-Hundred
(projectenv) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred$ pip install -r requirements.txt
ERROR: Invalid requirement: 'aiofiles=23.2.1=pypi_0' (from line 4 of requirements.txt)
Hint: = is not a valid operator. Did you mean == ?
(projectenv) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred$
```
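The `aiofiles=23.2.1=pypi_0` entries are in conda's `name=version=build` export format, which pip rejects. As a rough sketch (not part of the repo), the build string can be stripped and the single `=` doubled with sed; review the output before installing from it:

```shell
# Convert a conda-export requirement line ("name=version=build") to pip
# format ("name==version") by dropping the build string. Sketch only.
printf 'aiofiles=23.2.1=pypi_0\n' |
  sed -E 's/^([A-Za-z0-9._-]+)=([^=]+)=.*$/\1==\2/'
# prints: aiofiles==23.2.1
```

The same substitution can be applied over the whole requirements.txt to produce a pip-installable copy.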

LLM-Zero-to-Hundred/Open-Source-RAG-GEMMA$ pip install -r requirements.txt - success

LLM-Zero-to-Hundred/LLM-Fine-Tuning$ pip install -r requirements.txt - error

```
ERROR: Could not find a version that satisfies the requirement torch==2.1.1+cu121 (from versions: 1.13.0, 1.13.1, 2.0.0, 2.0.1, 2.1.0, 2.1.1, 2.1.2, 2.2.0, 2.2.1, 2.2.2)
ERROR: No matching distribution found for torch==2.1.1+cu121
```
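The `+cu121` suffix is a local version label: those builds are published on PyTorch's own wheel index rather than PyPI, which is why pip only sees the plain versions. A hedged sketch of one workaround (assumes a CUDA 12.1 machine; the generated file name is illustrative):

```shell
# Write a small requirements fragment that points pip at the PyTorch CUDA
# 12.1 wheel index before pinning the +cu121 build. File name is illustrative.
printf '%s\n' \
  '--extra-index-url https://download.pytorch.org/whl/cu121' \
  'torch==2.1.1+cu121' > torch-requirements.txt
cat torch-requirements.txt
```

Alternatively, use the install selector on https://pytorch.org/, as suggested later in the thread.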


```
(projectenv) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred/WebGPT$ pip install -r requirements.txt
Requirement already satisfied: duckduckgo_search==3.9.6 in /home/user/miniconda3/envs/projectenv/lib/python3.11/site-packages (from -r requirements.txt (line 1)) (3.9.6)
Requirement already satisfied: openai==0.28.0 in /home/user/miniconda3/envs/projectenv/lib/python3.11/site-packages (from -r requirements.txt (line 2)) (0.28.0)
Requirement already satisfied: Pillow==10.1.0 in /home/user/miniconda3/envs/projectenv/lib/python3.11/site-packages (from -r requirements.txt (line 3)) (10.1.0)
ERROR: Cannot install Pillow==10.1.0 and Pillow==10.3.0 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested Pillow==10.1.0
    The user requested Pillow==10.3.0

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
```
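pip's two suggestions map directly onto edits to the requirements file. A sketch against a hypothetical file (`reqs-demo.txt` and the loosened bound are illustrative assumptions, not the repo's actual pins):

```shell
# Demo file with a hard Pillow pin (illustrative, not the repo's file).
printf 'openai==0.28.0\nPillow==10.1.0\n' > reqs-demo.txt

# Option 1: loosen the pin so the resolver can pick a compatible release.
sed -E 's/^Pillow==.*/Pillow>=10.1.0/' reqs-demo.txt

# Option 2: drop the pin entirely and let pip resolve Pillow itself.
sed -E '/^Pillow==/d' reqs-demo.txt
```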

LLM-Zero-to-Hundred/RAG-GPT$ pip install -r requirements.txt - success

RAGMaster-LlamaIndex-vs-Langchain$ pip install -r llama_index_env_requirements.txt - success

/HUMAIN-advanced-multimodal-chatbot$ - success

Farzad-R commented 7 months ago

The problems arise because these projects were developed on Windows and you are trying to run them on Linux. So, please read the following. All the projects except HUMAIN were developed on Windows. HUMAIN was developed on WSL (and, due to the requirements of the bitsandbytes library, it can only be executed on Linux operating systems).

  1. For the projects that require HuggingFace (e.g., LLM-Fine-Tuning), you need to install CUDA and torch separately, using the guide on the PyTorch website: https://pytorch.org/
  2. Make sure that your OS has access to a GPU, since these projects need it.
  3. Please use a separate environment for each project. (Several of them, such as RAG-GPT, WebRAGQuery, and WebGPT, can actually share one environment, but a separate environment per project is the best practice and helps avoid possible conflicts.)
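Point 3 can be sketched with the standard-library `venv` module (the thread itself uses conda, where the equivalent is `conda create --name WebGPT python=3.11`); the environment name here is illustrative:

```shell
# Create one isolated environment per sub-project (name is illustrative),
# then verify the interpreter really is running inside the new environment.
python3 -m venv webgpt-env
webgpt-env/bin/python -c 'import sys; print(sys.prefix != sys.base_prefix)'
# prints: True
```

Each project's `pip install -r requirements.txt` then runs against its own environment, so version pins cannot collide across projects.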

As far as I can tell, you could successfully install the libraries for these projects:

Would you please confirm this?

So, in the case of WebRAGQuery and WebGPT, if you are facing any issues with libraries such as Pillow, you can either remove the pin so that pip automatically installs the proper version, or loosen the version constraint so that pip can find the right version automatically.

Please let me know if this helps.

Farzad-R commented 7 months ago

Also, since the issues all come from the installation of the dependencies, I will close the other three issues and we can communicate here. That will be easier.

DrivenIdeaLab commented 7 months ago

perfect sounds good

DrivenIdeaLab commented 7 months ago

https://docs.google.com/document/d/1MAlJJjrQBtdI1Yqq95qVc7hbHxfvAfbYskAmgRJEW0E/edit?usp=sharing

DrivenIdeaLab commented 7 months ago

```
./run_setup.sh
```

```
source ../../../python_env/whisper-env/bin/activate
python src/app.py
(base) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred/HUMAIN-advanced-multimodal-chatbot$ source ../../../python_env/whisper-env/bin/activate
-bash: ../../../python_env/whisper-env/bin/activate: No such file or directory
(base) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred/HUMAIN-advanced-multimodal-chatbot$ python src/app.py
Traceback (most recent call last):
  File "/home/user/zero-to-hundred/LLM-Zero-to-Hundred/HUMAIN-advanced-multimodal-chatbot/src/app.py", line 1, in <module>
    import gradio as gr
ModuleNotFoundError: No module named 'gradio'
(base) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred/HUMAIN-advanced-multimodal-chatbot$ 4
4: command not found
(base) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred/HUMAIN-advanced-multimodal-chatbot$ 3
3: command not found
(base) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred/HUMAIN-advanced-multimodal-chatbot$ 2
2: command not found
(base) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred/HUMAIN-advanced-multimodal-chatbot$ 1
1: command not found
(base) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred/HUMAIN-advanced-multimodal-chatbot$ 0
0: command not found
(base) user@Nathan:~/zero-to-hundred/LLM-Zero-to-Hundred/HUMAIN-advanced-multimodal-chatbot$
```

Farzad-R commented 7 months ago

If I understand correctly, you are running `pip install -r requirements.txt` from the LLM-ZERO-To-Hundred folder. Can you please run it in a sub-project instead? For example: after you clone the repo, go to the LLM-ZERO-To-Hundred folder, activate your environment, and run:

```
cd WebRAGQuery
pip install -r requirements.txt
```

I am going to remove the requirements.txt in the main folder (LLM-ZERO-To-Hundred), since it is a bit outdated.

Farzad-R commented 7 months ago

To fix the error that you got when running ./run_setup.sh, you simply need to install the gradio library. So, run:

```
sudo apt update && sudo apt upgrade
pip install --upgrade gradio
```

Then you are good to go.

DrivenIdeaLab commented 7 months ago

```
C:\Users\npall\Documents\LLM-Zero-to-Hundred>cd WebGPT

C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebGPT>conda create --name WebGPT Python=3.11
Channels:

## Package Plan ##

  environment location: C:\Users\npall\anaconda3\envs\WebGPT

  added / updated specs:

The following NEW packages will be INSTALLED:

  bzip2              pkgs/main/win-64::bzip2-1.0.8-h2bbff1b_5
  ca-certificates    pkgs/main/win-64::ca-certificates-2024.3.11-haa95532_0
  libffi             pkgs/main/win-64::libffi-3.4.4-hd77b12b_0
  openssl            pkgs/main/win-64::openssl-3.0.13-h2bbff1b_0
  pip                pkgs/main/win-64::pip-23.3.1-py311haa95532_0
  python             pkgs/main/win-64::python-3.11.8-he1021f5_0
  setuptools         pkgs/main/win-64::setuptools-68.2.2-py311haa95532_0
  sqlite             pkgs/main/win-64::sqlite-3.41.2-h2bbff1b_0
  tk                 pkgs/main/win-64::tk-8.6.12-h2bbff1b_0
  tzdata             pkgs/main/noarch::tzdata-2024a-h04d1e81_0
  vc                 pkgs/main/win-64::vc-14.2-h21ff451_1
  vs2015_runtime     pkgs/main/win-64::vs2015_runtime-14.27.29016-h5e58377_2
  wheel              pkgs/main/win-64::wheel-0.41.2-py311haa95532_0
  xz                 pkgs/main/win-64::xz-5.4.6-h8cc25b3_0
  zlib               pkgs/main/win-64::zlib-1.2.13-h8cc25b3_0

Proceed ([y]/n)? y

Downloading and Extracting Packages:

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate WebGPT
#
# To deactivate an active environment, use
#
#     $ conda deactivate
```

C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebGPT>conda activate WebGPT

```
(WebGPT) C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebGPT>pip install -r requirements.txt
Collecting duckduckgo_search==3.9.6 (from -r requirements.txt (line 1))
  Using cached duckduckgo_search-3.9.6-py3-none-any.whl.metadata (21 kB)
Collecting openai==0.28.0 (from -r requirements.txt (line 2))
  Using cached openai-0.28.0-py3-none-any.whl.metadata (13 kB)
Collecting Pillow==10.1.0 (from -r requirements.txt (line 3))
  Downloading Pillow-10.1.0-cp311-cp311-win_amd64.whl.metadata (9.6 kB)
ERROR: Cannot install Pillow==10.1.0 and Pillow==10.3.0 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested Pillow==10.1.0
    The user requested Pillow==10.3.0

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
```

DrivenIdeaLab commented 7 months ago

```
C:\Users\npall\Documents\LLM-Zero-to-Hundred>conda create -name WebRagQuery Python=3.11
Channels:

PackagesNotFoundError: The following packages are not available from current channels:

Current channels:

To search for alternate channels that may provide the conda package you're looking for, navigate to

    https://anaconda.org

and use the search bar at the top of the page.
```

(This first attempt fails because `-name` is missing a dash, so conda does not recognize it as the `--name` flag and treats `WebRagQuery` as a package to install. The retry below with `--name` succeeds.)

```
C:\Users\npall\Documents\LLM-Zero-to-Hundred>conda create --name WebRagQuery Python=3.11
Channels:

## Package Plan ##

  environment location: C:\Users\npall\anaconda3\envs\WebRagQuery

  added / updated specs:

The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    bzip2-1.0.8                |       h2bbff1b_5          78 KB
    libffi-3.4.4               |       hd77b12b_0         113 KB
    pip-23.3.1                 |  py311haa95532_0         3.5 MB
    python-3.11.8              |       he1021f5_0        18.3 MB
    setuptools-68.2.2          |  py311haa95532_0         1.2 MB
    tk-8.6.12                  |       h2bbff1b_0         3.1 MB
    wheel-0.41.2               |  py311haa95532_0         163 KB
    xz-5.4.6                   |       h8cc25b3_0         587 KB
    zlib-1.2.13                |       h8cc25b3_0         113 KB
    ------------------------------------------------------------
                                           Total:        27.1 MB

The following NEW packages will be INSTALLED:

  bzip2              pkgs/main/win-64::bzip2-1.0.8-h2bbff1b_5
  ca-certificates    pkgs/main/win-64::ca-certificates-2024.3.11-haa95532_0
  libffi             pkgs/main/win-64::libffi-3.4.4-hd77b12b_0
  openssl            pkgs/main/win-64::openssl-3.0.13-h2bbff1b_0
  pip                pkgs/main/win-64::pip-23.3.1-py311haa95532_0
  python             pkgs/main/win-64::python-3.11.8-he1021f5_0
  setuptools         pkgs/main/win-64::setuptools-68.2.2-py311haa95532_0
  sqlite             pkgs/main/win-64::sqlite-3.41.2-h2bbff1b_0
  tk                 pkgs/main/win-64::tk-8.6.12-h2bbff1b_0
  tzdata             pkgs/main/noarch::tzdata-2024a-h04d1e81_0
  vc                 pkgs/main/win-64::vc-14.2-h21ff451_1
  vs2015_runtime     pkgs/main/win-64::vs2015_runtime-14.27.29016-h5e58377_2
  wheel              pkgs/main/win-64::wheel-0.41.2-py311haa95532_0
  xz                 pkgs/main/win-64::xz-5.4.6-h8cc25b3_0
  zlib               pkgs/main/win-64::zlib-1.2.13-h8cc25b3_0

Proceed ([y]/n)? y

Downloading and Extracting Packages:

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate WebRagQuery
#
# To deactivate an active environment, use
#
#     $ conda deactivate
```

C:\Users\npall\Documents\LLM-Zero-to-Hundred>conda activate WebRagQuery

(WebRagQuery) C:\Users\npall\Documents\LLM-Zero-to-Hundred>cd WebRagQuery

(WebRagQuery) C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery>pip install -r requirements.txt Collecting chainlit==1.0.500 (from -r requirements.txt (line 1)) Downloading chainlit-1.0.500-py3-none-any.whl.metadata (5.5 kB) Collecting duckduckgo_search==3.9.6 (from -r requirements.txt (line 2)) Downloading duckduckgo_search-3.9.6-py3-none-any.whl.metadata (21 kB) Collecting langchain==0.1.14 (from -r requirements.txt (line 3)) Using cached langchain-0.1.14-py3-none-any.whl.metadata (13 kB) Collecting openai==0.28.0 (from -r requirements.txt (line 4)) Using cached openai-0.28.0-py3-none-any.whl.metadata (13 kB) Collecting pandas==2.2.1 (from -r requirements.txt (line 5)) Using cached pandas-2.2.1-cp311-cp311-win_amd64.whl.metadata (19 kB) Collecting pydantic==2.6.4 (from -r requirements.txt (line 6)) Using cached pydantic-2.6.4-py3-none-any.whl.metadata (85 kB) Collecting pyprojroot==0.3.0 (from -r requirements.txt (line 7)) Downloading pyprojroot-0.3.0-py3-none-any.whl.metadata (4.8 kB) Collecting python-dotenv==1.0.1 (from -r requirements.txt (line 8)) Using cached python_dotenv-1.0.1-py3-none-any.whl.metadata (23 kB) Collecting PyYAML==6.0.1 (from -r requirements.txt (line 9)) Using cached PyYAML-6.0.1-cp311-cp311-win_amd64.whl.metadata (2.1 kB) Collecting tiktoken==0.5.1 (from -r requirements.txt (line 11)) Downloading tiktoken-0.5.1-cp311-cp311-win_amd64.whl.metadata (6.8 kB) Collecting aiofiles<24.0.0,>=23.1.0 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached aiofiles-23.2.1-py3-none-any.whl.metadata (9.7 kB) Collecting asyncer<0.0.3,>=0.0.2 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached asyncer-0.0.2-py3-none-any.whl.metadata (6.8 kB) Collecting click<9.0.0,>=8.1.3 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached click-8.1.7-py3-none-any.whl.metadata (3.0 kB) Collecting dataclasses_json<0.6.0,>=0.5.7 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached 
dataclasses_json-0.5.14-py3-none-any.whl.metadata (22 kB) Collecting fastapi>=0.100 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached fastapi-0.110.1-py3-none-any.whl.metadata (24 kB) Collecting fastapi-socketio<0.0.11,>=0.0.10 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached fastapi_socketio-0.0.10-py3-none-any.whl.metadata (2.6 kB) Collecting filetype<2.0.0,>=1.2.0 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached filetype-1.2.0-py2.py3-none-any.whl.metadata (6.5 kB) Collecting httpx>=0.23.0 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached httpx-0.27.0-py3-none-any.whl.metadata (7.2 kB) Collecting lazify<0.5.0,>=0.4.0 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached Lazify-0.4.0-py2.py3-none-any.whl.metadata (1.4 kB) Collecting literalai==0.0.401 (from chainlit==1.0.500->-r requirements.txt (line 1)) Downloading literalai-0.0.401.tar.gz (32 kB) Preparing metadata (setup.py) ... done Collecting nest-asyncio<2.0.0,>=1.5.6 (from chainlit==1.0.500->-r requirements.txt (line 1)) Downloading nest_asyncio-1.6.0-py3-none-any.whl.metadata (2.8 kB) Collecting packaging<24.0,>=23.1 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached packaging-23.2-py3-none-any.whl.metadata (3.2 kB) Collecting pyjwt<3.0.0,>=2.8.0 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached PyJWT-2.8.0-py3-none-any.whl.metadata (4.2 kB) Collecting python-graphql-client<0.5.0,>=0.4.3 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached python_graphql_client-0.4.3-py3-none-any.whl.metadata (4.4 kB) Collecting python-multipart<0.0.10,>=0.0.9 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached python_multipart-0.0.9-py3-none-any.whl.metadata (2.5 kB) Collecting starlette<0.33.0 (from chainlit==1.0.500->-r requirements.txt (line 1)) Downloading starlette-0.32.0.post1-py3-none-any.whl.metadata (5.8 kB) Collecting syncer<3.0.0,>=2.0.3 (from 
chainlit==1.0.500->-r requirements.txt (line 1)) Using cached syncer-2.0.3-py2.py3-none-any.whl Collecting tomli<3.0.0,>=2.0.1 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached tomli-2.0.1-py3-none-any.whl.metadata (8.9 kB) Collecting uptrace<2.0.0,>=1.22.0 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached uptrace-1.22.0-py3-none-any.whl.metadata (2.3 kB) Collecting uvicorn<0.26.0,>=0.25.0 (from chainlit==1.0.500->-r requirements.txt (line 1)) Downloading uvicorn-0.25.0-py3-none-any.whl.metadata (6.4 kB) Collecting watchfiles<0.21.0,>=0.20.0 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached watchfiles-0.20.0-cp37-abi3-win_amd64.whl.metadata (5.0 kB) Collecting lxml>=4.9.3 (from duckduckgo_search==3.9.6->-r requirements.txt (line 2)) Downloading lxml-5.2.1-cp311-cp311-win_amd64.whl.metadata (3.5 kB) Collecting SQLAlchemy<3,>=1.4 (from langchain==0.1.14->-r requirements.txt (line 3)) Using cached SQLAlchemy-2.0.29-cp311-cp311-win_amd64.whl.metadata (9.8 kB) Collecting aiohttp<4.0.0,>=3.8.3 (from langchain==0.1.14->-r requirements.txt (line 3)) Using cached aiohttp-3.9.3-cp311-cp311-win_amd64.whl.metadata (7.6 kB) Collecting jsonpatch<2.0,>=1.33 (from langchain==0.1.14->-r requirements.txt (line 3)) Using cached jsonpatch-1.33-py2.py3-none-any.whl.metadata (3.0 kB) Collecting langchain-community<0.1,>=0.0.30 (from langchain==0.1.14->-r requirements.txt (line 3)) Using cached langchain_community-0.0.31-py3-none-any.whl.metadata (8.4 kB) Collecting langchain-core<0.2.0,>=0.1.37 (from langchain==0.1.14->-r requirements.txt (line 3)) Using cached langchain_core-0.1.40-py3-none-any.whl.metadata (5.9 kB) Collecting langchain-text-splitters<0.1,>=0.0.1 (from langchain==0.1.14->-r requirements.txt (line 3)) Using cached langchain_text_splitters-0.0.1-py3-none-any.whl.metadata (2.0 kB) Collecting langsmith<0.2.0,>=0.1.17 (from langchain==0.1.14->-r requirements.txt (line 3)) Downloading 
langsmith-0.1.40-py3-none-any.whl.metadata (13 kB) Collecting numpy<2,>=1 (from langchain==0.1.14->-r requirements.txt (line 3)) Using cached numpy-1.26.4-cp311-cp311-win_amd64.whl.metadata (61 kB) Collecting requests<3,>=2 (from langchain==0.1.14->-r requirements.txt (line 3)) Using cached requests-2.31.0-py3-none-any.whl.metadata (4.6 kB) Collecting tenacity<9.0.0,>=8.1.0 (from langchain==0.1.14->-r requirements.txt (line 3)) Using cached tenacity-8.2.3-py3-none-any.whl.metadata (1.0 kB) Collecting tqdm (from openai==0.28.0->-r requirements.txt (line 4)) Using cached tqdm-4.66.2-py3-none-any.whl.metadata (57 kB) Collecting python-dateutil>=2.8.2 (from pandas==2.2.1->-r requirements.txt (line 5)) Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl.metadata (8.4 kB) Collecting pytz>=2020.1 (from pandas==2.2.1->-r requirements.txt (line 5)) Using cached pytz-2024.1-py2.py3-none-any.whl.metadata (22 kB) Collecting tzdata>=2022.7 (from pandas==2.2.1->-r requirements.txt (line 5)) Using cached tzdata-2024.1-py2.py3-none-any.whl.metadata (1.4 kB) Collecting annotated-types>=0.4.0 (from pydantic==2.6.4->-r requirements.txt (line 6)) Using cached annotated_types-0.6.0-py3-none-any.whl.metadata (12 kB) Collecting pydantic-core==2.16.3 (from pydantic==2.6.4->-r requirements.txt (line 6)) Using cached pydantic_core-2.16.3-cp311-none-win_amd64.whl.metadata (6.6 kB) Collecting typing-extensions>=4.6.1 (from pydantic==2.6.4->-r requirements.txt (line 6)) Using cached typing_extensions-4.11.0-py3-none-any.whl.metadata (3.0 kB) Collecting regex>=2022.1.18 (from tiktoken==0.5.1->-r requirements.txt (line 11)) Using cached regex-2023.12.25-cp311-cp311-win_amd64.whl.metadata (41 kB) Collecting chevron>=0.14.0 (from literalai==0.0.401->chainlit==1.0.500->-r requirements.txt (line 1)) Downloading chevron-0.14.0-py3-none-any.whl.metadata (4.9 kB) Collecting aiosignal>=1.1.2 (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.14->-r requirements.txt (line 3)) Using cached 
aiosignal-1.3.1-py3-none-any.whl.metadata (4.0 kB) Collecting attrs>=17.3.0 (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.14->-r requirements.txt (line 3)) Using cached attrs-23.2.0-py3-none-any.whl.metadata (9.5 kB) Collecting frozenlist>=1.1.1 (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.14->-r requirements.txt (line 3)) Using cached frozenlist-1.4.1-cp311-cp311-win_amd64.whl.metadata (12 kB) Collecting multidict<7.0,>=4.5 (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.14->-r requirements.txt (line 3)) Using cached multidict-6.0.5-cp311-cp311-win_amd64.whl.metadata (4.3 kB) Collecting yarl<2.0,>=1.0 (from aiohttp<4.0.0,>=3.8.3->langchain==0.1.14->-r requirements.txt (line 3)) Using cached yarl-1.9.4-cp311-cp311-win_amd64.whl.metadata (32 kB) Collecting anyio<4.0.0,>=3.4.0 (from asyncer<0.0.3,>=0.0.2->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached anyio-3.7.1-py3-none-any.whl.metadata (4.7 kB) Collecting colorama (from click<9.0.0,>=8.1.3->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached colorama-0.4.6-py2.py3-none-any.whl.metadata (17 kB) Collecting marshmallow<4.0.0,>=3.18.0 (from dataclasses_json<0.6.0,>=0.5.7->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached marshmallow-3.21.1-py3-none-any.whl.metadata (7.2 kB) Collecting typing-inspect<1,>=0.4.0 (from dataclasses_json<0.6.0,>=0.5.7->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached typing_inspect-0.9.0-py3-none-any.whl.metadata (1.5 kB) INFO: pip is looking at multiple versions of fastapi to determine which version is compatible with other requirements. This could take a while. 
Collecting fastapi>=0.100 (from chainlit==1.0.500->-r requirements.txt (line 1)) Using cached fastapi-0.110.0-py3-none-any.whl.metadata (25 kB) Downloading fastapi-0.109.2-py3-none-any.whl.metadata (25 kB) Downloading fastapi-0.109.1-py3-none-any.whl.metadata (25 kB) Downloading fastapi-0.109.0-py3-none-any.whl.metadata (24 kB) Downloading fastapi-0.108.0-py3-none-any.whl.metadata (24 kB) Collecting python-socketio>=4.6.0 (from fastapi-socketio<0.0.11,>=0.0.10->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached python_socketio-5.11.2-py3-none-any.whl.metadata (3.2 kB) Collecting certifi (from httpx>=0.23.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached certifi-2024.2.2-py3-none-any.whl.metadata (2.2 kB) Collecting httpcore==1. (from httpx>=0.23.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached httpcore-1.0.5-py3-none-any.whl.metadata (20 kB) Collecting idna (from httpx>=0.23.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached idna-3.6-py3-none-any.whl.metadata (9.9 kB) Collecting sniffio (from httpx>=0.23.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached sniffio-1.3.1-py3-none-any.whl.metadata (3.9 kB) Collecting h11<0.15,>=0.13 (from httpcore==1.->httpx>=0.23.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached h11-0.14.0-py3-none-any.whl.metadata (8.2 kB) Collecting brotli (from httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 2)) Using cached Brotli-1.1.0-cp311-cp311-win_amd64.whl.metadata (5.6 kB) Collecting h2<5,>=3 (from httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 2)) Downloading h2-4.1.0-py3-none-any.whl.metadata (3.6 kB) Collecting socksio==1.* (from httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 2)) Downloading socksio-1.0.0-py3-none-any.whl.metadata (6.1 kB) Collecting jsonpointer>=1.9 (from jsonpatch<2.0,>=1.33->langchain==0.1.14->-r requirements.txt 
(line 3)) Using cached jsonpointer-2.4-py2.py3-none-any.whl.metadata (2.5 kB) Collecting orjson<4.0.0,>=3.9.14 (from langsmith<0.2.0,>=0.1.17->langchain==0.1.14->-r requirements.txt (line 3)) Using cached orjson-3.10.0-cp311-none-win_amd64.whl.metadata (50 kB) Collecting six>=1.5 (from python-dateutil>=2.8.2->pandas==2.2.1->-r requirements.txt (line 5)) Using cached six-1.16.0-py2.py3-none-any.whl.metadata (1.8 kB) Collecting websockets>=5.0 (from python-graphql-client<0.5.0,>=0.4.3->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached websockets-12.0-cp311-cp311-win_amd64.whl.metadata (6.8 kB) Collecting charset-normalizer<4,>=2 (from requests<3,>=2->langchain==0.1.14->-r requirements.txt (line 3)) Using cached charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl.metadata (34 kB) Collecting urllib3<3,>=1.21.1 (from requests<3,>=2->langchain==0.1.14->-r requirements.txt (line 3)) Using cached urllib3-2.2.1-py3-none-any.whl.metadata (6.4 kB) Collecting greenlet!=0.4.17 (from SQLAlchemy<3,>=1.4->langchain==0.1.14->-r requirements.txt (line 3)) Using cached greenlet-3.0.3-cp311-cp311-win_amd64.whl.metadata (3.9 kB) Collecting opentelemetry-api~=1.22 (from uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached opentelemetry_api-1.24.0-py3-none-any.whl.metadata (1.3 kB) Collecting opentelemetry-exporter-otlp~=1.22 (from uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached opentelemetry_exporter_otlp-1.24.0-py3-none-any.whl.metadata (2.2 kB) Collecting opentelemetry-instrumentation~=0.43b0 (from uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached opentelemetry_instrumentation-0.45b0-py3-none-any.whl.metadata (6.1 kB) Collecting opentelemetry-sdk~=1.22 (from uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached opentelemetry_sdk-1.24.0-py3-none-any.whl.metadata (1.4 kB) Collecting hyperframe<7,>=6.0 (from 
h2<5,>=3->httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 2)) Downloading hyperframe-6.0.1-py3-none-any.whl.metadata (2.7 kB) Collecting hpack<5,>=4.0 (from h2<5,>=3->httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 2)) Downloading hpack-4.0.0-py3-none-any.whl.metadata (2.5 kB) Collecting deprecated>=1.2.6 (from opentelemetry-api~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached Deprecated-1.2.14-py2.py3-none-any.whl.metadata (5.4 kB) Collecting importlib-metadata<=7.0,>=6.0 (from opentelemetry-api~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached importlib_metadata-7.0.0-py3-none-any.whl.metadata (4.9 kB) Collecting opentelemetry-exporter-otlp-proto-grpc==1.24.0 (from opentelemetry-exporter-otlp~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached opentelemetry_exporter_otlp_proto_grpc-1.24.0-py3-none-any.whl.metadata (2.2 kB) Collecting opentelemetry-exporter-otlp-proto-http==1.24.0 (from opentelemetry-exporter-otlp~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached opentelemetry_exporter_otlp_proto_http-1.24.0-py3-none-any.whl.metadata (2.1 kB) Collecting googleapis-common-protos~=1.52 (from opentelemetry-exporter-otlp-proto-grpc==1.24.0->opentelemetry-exporter-otlp~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached googleapis_common_protos-1.63.0-py2.py3-none-any.whl.metadata (1.5 kB) Collecting grpcio<2.0.0,>=1.0.0 (from opentelemetry-exporter-otlp-proto-grpc==1.24.0->opentelemetry-exporter-otlp~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached grpcio-1.62.1-cp311-cp311-win_amd64.whl.metadata (4.2 kB) Collecting opentelemetry-exporter-otlp-proto-common==1.24.0 (from 
opentelemetry-exporter-otlp-proto-grpc==1.24.0->opentelemetry-exporter-otlp~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached opentelemetry_exporter_otlp_proto_common-1.24.0-py3-none-any.whl.metadata (1.7 kB) Collecting opentelemetry-proto==1.24.0 (from opentelemetry-exporter-otlp-proto-grpc==1.24.0->opentelemetry-exporter-otlp~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached opentelemetry_proto-1.24.0-py3-none-any.whl.metadata (2.2 kB) Collecting protobuf<5.0,>=3.19 (from opentelemetry-proto==1.24.0->opentelemetry-exporter-otlp-proto-grpc==1.24.0->opentelemetry-exporter-otlp~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached protobuf-4.25.3-cp310-abi3-win_amd64.whl.metadata (541 bytes) Requirement already satisfied: setuptools>=16.0 in c:\users\npall\anaconda3\envs\webragquery\lib\site-packages (from opentelemetry-instrumentation~=0.43b0->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) (68.2.2) Collecting wrapt<2.0.0,>=1.0.0 (from opentelemetry-instrumentation~=0.43b0->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Downloading wrapt-1.16.0-cp311-cp311-win_amd64.whl.metadata (6.8 kB) Collecting opentelemetry-semantic-conventions==0.45b0 (from opentelemetry-sdk~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached opentelemetry_semantic_conventions-0.45b0-py3-none-any.whl.metadata (2.2 kB) Collecting bidict>=0.21.0 (from python-socketio>=4.6.0->fastapi-socketio<0.0.11,>=0.0.10->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached bidict-0.23.1-py3-none-any.whl.metadata (8.7 kB) Collecting python-engineio>=4.8.0 (from python-socketio>=4.6.0->fastapi-socketio<0.0.11,>=0.0.10->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached python_engineio-4.9.0-py3-none-any.whl.metadata (2.2 kB) Collecting mypy-extensions>=0.3.0 (from 
typing-inspect<1,>=0.4.0->dataclasses_json<0.6.0,>=0.5.7->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached mypy_extensions-1.0.0-py3-none-any.whl.metadata (1.1 kB) Collecting zipp>=0.5 (from importlib-metadata<=7.0,>=6.0->opentelemetry-api~=1.22->uptrace<2.0.0,>=1.22.0->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached zipp-3.18.1-py3-none-any.whl.metadata (3.5 kB) Collecting simple-websocket>=0.10.0 (from python-engineio>=4.8.0->python-socketio>=4.6.0->fastapi-socketio<0.0.11,>=0.0.10->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached simple_websocket-1.0.0-py3-none-any.whl.metadata (1.3 kB) Collecting wsproto (from simple-websocket>=0.10.0->python-engineio>=4.8.0->python-socketio>=4.6.0->fastapi-socketio<0.0.11,>=0.0.10->chainlit==1.0.500->-r requirements.txt (line 1)) Using cached wsproto-1.2.0-py3-none-any.whl.metadata (5.6 kB) Downloading chainlit-1.0.500-py3-none-any.whl (4.4 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.4/4.4 MB 12.9 MB/s eta 0:00:00 Downloading duckduckgo_search-3.9.6-py3-none-any.whl (25 kB) Using cached langchain-0.1.14-py3-none-any.whl (812 kB) Using cached openai-0.28.0-py3-none-any.whl (76 kB) Using cached pandas-2.2.1-cp311-cp311-win_amd64.whl (11.6 MB) Using cached pydantic-2.6.4-py3-none-any.whl (394 kB) Downloading pyprojroot-0.3.0-py3-none-any.whl (7.6 kB) Using cached python_dotenv-1.0.1-py3-none-any.whl (19 kB) Using cached PyYAML-6.0.1-cp311-cp311-win_amd64.whl (144 kB) Downloading tiktoken-0.5.1-cp311-cp311-win_amd64.whl (759 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 759.8/759.8 kB 11.9 MB/s eta 0:00:00 Using cached pydantic_core-2.16.3-cp311-none-win_amd64.whl (1.9 MB) Using cached aiofiles-23.2.1-py3-none-any.whl (15 kB) Using cached aiohttp-3.9.3-cp311-cp311-win_amd64.whl (365 kB) Using cached annotated_types-0.6.0-py3-none-any.whl (12 kB) Using cached asyncer-0.0.2-py3-none-any.whl (8.3 kB) Using cached click-8.1.7-py3-none-any.whl (97 kB) Using cached 
dataclasses_json-0.5.14-py3-none-any.whl (26 kB) Downloading fastapi-0.108.0-py3-none-any.whl (92 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 92.0/92.0 kB ? eta 0:00:00 Using cached fastapi_socketio-0.0.10-py3-none-any.whl (7.4 kB) Using cached filetype-1.2.0-py2.py3-none-any.whl (19 kB) Using cached httpx-0.27.0-py3-none-any.whl (75 kB) Using cached httpcore-1.0.5-py3-none-any.whl (77 kB) Downloading socksio-1.0.0-py3-none-any.whl (12 kB) Using cached jsonpatch-1.33-py2.py3-none-any.whl (12 kB) Using cached langchain_community-0.0.31-py3-none-any.whl (1.9 MB) Downloading langchain_core-0.1.40-py3-none-any.whl (276 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 276.8/276.8 kB 16.7 MB/s eta 0:00:00 Using cached langchain_text_splitters-0.0.1-py3-none-any.whl (21 kB) Downloading langsmith-0.1.40-py3-none-any.whl (87 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 87.5/87.5 kB ? eta 0:00:00 Using cached Lazify-0.4.0-py2.py3-none-any.whl (3.1 kB) Downloading lxml-5.2.1-cp311-cp311-win_amd64.whl (3.8 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.8/3.8 MB 13.5 MB/s eta 0:00:00 Downloading nest_asyncio-1.6.0-py3-none-any.whl (5.2 kB) Using cached numpy-1.26.4-cp311-cp311-win_amd64.whl (15.8 MB) Using cached packaging-23.2-py3-none-any.whl (53 kB) Using cached PyJWT-2.8.0-py3-none-any.whl (22 kB) Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB) Using cached python_graphql_client-0.4.3-py3-none-any.whl (4.9 kB) Using cached python_multipart-0.0.9-py3-none-any.whl (22 kB) Using cached pytz-2024.1-py2.py3-none-any.whl (505 kB) Using cached regex-2023.12.25-cp311-cp311-win_amd64.whl (269 kB) Using cached requests-2.31.0-py3-none-any.whl (62 kB) Using cached SQLAlchemy-2.0.29-cp311-cp311-win_amd64.whl (2.1 MB) Downloading starlette-0.32.0.post1-py3-none-any.whl (70 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 70.0/70.0 kB ? 
eta 0:00:00 Using cached tenacity-8.2.3-py3-none-any.whl (24 kB) Using cached tomli-2.0.1-py3-none-any.whl (12 kB) Using cached typing_extensions-4.11.0-py3-none-any.whl (34 kB) Using cached tzdata-2024.1-py2.py3-none-any.whl (345 kB) Using cached uptrace-1.22.0-py3-none-any.whl (8.6 kB) Downloading uvicorn-0.25.0-py3-none-any.whl (60 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 60.3/60.3 kB 3.1 MB/s eta 0:00:00 Using cached watchfiles-0.20.0-cp37-abi3-win_amd64.whl (276 kB) Using cached tqdm-4.66.2-py3-none-any.whl (78 kB) Using cached aiosignal-1.3.1-py3-none-any.whl (7.6 kB) Using cached anyio-3.7.1-py3-none-any.whl (80 kB) Using cached attrs-23.2.0-py3-none-any.whl (60 kB) Using cached certifi-2024.2.2-py3-none-any.whl (163 kB) Using cached charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl (99 kB) Downloading chevron-0.14.0-py3-none-any.whl (11 kB) Using cached frozenlist-1.4.1-cp311-cp311-win_amd64.whl (50 kB) Using cached greenlet-3.0.3-cp311-cp311-win_amd64.whl (292 kB) Using cached h11-0.14.0-py3-none-any.whl (58 kB) Downloading h2-4.1.0-py3-none-any.whl (57 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.5/57.5 kB ? 
eta 0:00:00 Using cached idna-3.6-py3-none-any.whl (61 kB) Using cached jsonpointer-2.4-py2.py3-none-any.whl (7.8 kB) Using cached marshmallow-3.21.1-py3-none-any.whl (49 kB) Using cached multidict-6.0.5-cp311-cp311-win_amd64.whl (28 kB) Using cached opentelemetry_api-1.24.0-py3-none-any.whl (60 kB) Using cached opentelemetry_exporter_otlp-1.24.0-py3-none-any.whl (7.0 kB) Using cached opentelemetry_exporter_otlp_proto_grpc-1.24.0-py3-none-any.whl (18 kB) Using cached opentelemetry_exporter_otlp_proto_http-1.24.0-py3-none-any.whl (16 kB) Using cached opentelemetry_exporter_otlp_proto_common-1.24.0-py3-none-any.whl (17 kB) Using cached opentelemetry_proto-1.24.0-py3-none-any.whl (50 kB) Using cached opentelemetry_instrumentation-0.45b0-py3-none-any.whl (28 kB) Using cached opentelemetry_sdk-1.24.0-py3-none-any.whl (106 kB) Using cached opentelemetry_semantic_conventions-0.45b0-py3-none-any.whl (36 kB) Using cached orjson-3.10.0-cp311-none-win_amd64.whl (139 kB) Using cached python_socketio-5.11.2-py3-none-any.whl (75 kB) Using cached six-1.16.0-py2.py3-none-any.whl (11 kB) Using cached sniffio-1.3.1-py3-none-any.whl (10 kB) Using cached typing_inspect-0.9.0-py3-none-any.whl (8.8 kB) Using cached urllib3-2.2.1-py3-none-any.whl (121 kB) Using cached websockets-12.0-cp311-cp311-win_amd64.whl (124 kB) Using cached yarl-1.9.4-cp311-cp311-win_amd64.whl (76 kB) Using cached Brotli-1.1.0-cp311-cp311-win_amd64.whl (357 kB) Using cached colorama-0.4.6-py2.py3-none-any.whl (25 kB) Using cached bidict-0.23.1-py3-none-any.whl (32 kB) Using cached Deprecated-1.2.14-py2.py3-none-any.whl (9.6 kB) Downloading hpack-4.0.0-py3-none-any.whl (32 kB) Downloading hyperframe-6.0.1-py3-none-any.whl (12 kB) Using cached importlib_metadata-7.0.0-py3-none-any.whl (23 kB) Using cached mypy_extensions-1.0.0-py3-none-any.whl (4.7 kB) Using cached python_engineio-4.9.0-py3-none-any.whl (57 kB) Downloading wrapt-1.16.0-cp311-cp311-win_amd64.whl (37 kB) Using cached 
googleapis_common_protos-1.63.0-py2.py3-none-any.whl (229 kB) Using cached grpcio-1.62.1-cp311-cp311-win_amd64.whl (3.8 MB) Using cached simple_websocket-1.0.0-py3-none-any.whl (13 kB) Using cached zipp-3.18.1-py3-none-any.whl (8.2 kB) Using cached protobuf-4.25.3-cp310-abi3-win_amd64.whl (413 kB) Using cached wsproto-1.2.0-py3-none-any.whl (24 kB) Building wheels for collected packages: literalai Building wheel for literalai (setup.py) ... done Created wheel for literalai: filename=literalai-0.0.401-py3-none-any.whl size=39048 sha256=6ecb40496171b3d7583033a41bc44bda10781e31fe9663eb1ee7327e8c414020
Stored in directory: c:\users\npall\appdata\local\pip\cache\wheels\ea\36\bc\209b7758720c0f55cc5275d32d8e5158dd713f4e498353c3ce Successfully built literalai Installing collected packages: syncer, pytz, lazify, filetype, chevron, brotli, zipp, wrapt, websockets, urllib3, tzdata, typing-extensions, tomli, tenacity, socksio, sniffio, six, regex, PyYAML, python-multipart, python-dotenv, pyjwt, protobuf, packaging, orjson, opentelemetry-semantic-conventions, numpy, nest-asyncio, mypy-extensions, multidict, lxml, jsonpointer, idna, hyperframe, hpack, h11, grpcio, greenlet, frozenlist, colorama, charset-normalizer, certifi, bidict, attrs, annotated-types, aiofiles, yarl, wsproto, typing-inspect, tqdm, SQLAlchemy, requests, python-dateutil, pyprojroot, pydantic-core, opentelemetry-proto, marshmallow, jsonpatch, importlib-metadata, httpcore, h2, googleapis-common-protos, deprecated, click, anyio, aiosignal, watchfiles, uvicorn, tiktoken, starlette, simple-websocket, pydantic, pandas, opentelemetry-exporter-otlp-proto-common, opentelemetry-api, httpx, dataclasses_json, asyncer, aiohttp, python-graphql-client, python-engineio, opentelemetry-sdk, opentelemetry-instrumentation, openai, literalai, langsmith, fastapi, python-socketio, opentelemetry-exporter-otlp-proto-http, opentelemetry-exporter-otlp-proto-grpc, langchain-core, duckduckgo_search, opentelemetry-exporter-otlp, langchain-text-splitters, langchain-community, fastapi-socketio, uptrace, langchain, chainlit Successfully installed PyYAML-6.0.1 SQLAlchemy-2.0.29 aiofiles-23.2.1 aiohttp-3.9.3 aiosignal-1.3.1 annotated-types-0.6.0 anyio-3.7.1 asyncer-0.0.2 attrs-23.2.0 bidict-0.23.1 brotli-1.1.0 certifi-2024.2.2 chainlit-1.0.500 charset-normalizer-3.3.2 chevron-0.14.0 click-8.1.7 colorama-0.4.6 dataclasses_json-0.5.14 deprecated-1.2.14 duckduckgo_search-3.9.6 fastapi-0.108.0 fastapi-socketio-0.0.10 filetype-1.2.0 frozenlist-1.4.1 googleapis-common-protos-1.63.0 greenlet-3.0.3 grpcio-1.62.1 h11-0.14.0 h2-4.1.0 hpack-4.0.0 
httpcore-1.0.5 httpx-0.27.0 hyperframe-6.0.1 idna-3.6 importlib-metadata-7.0.0 jsonpatch-1.33 jsonpointer-2.4 langchain-0.1.14 langchain-community-0.0.31 langchain-core-0.1.40 langchain-text-splitters-0.0.1 langsmith-0.1.40 lazify-0.4.0 literalai-0.0.401 lxml-5.2.1 marshmallow-3.21.1 multidict-6.0.5 mypy-extensions-1.0.0 nest-asyncio-1.6.0 numpy-1.26.4 openai-0.28.0 opentelemetry-api-1.24.0 opentelemetry-exporter-otlp-1.24.0 opentelemetry-exporter-otlp-proto-common-1.24.0 opentelemetry-exporter-otlp-proto-grpc-1.24.0 opentelemetry-exporter-otlp-proto-http-1.24.0 opentelemetry-instrumentation-0.45b0 opentelemetry-proto-1.24.0 opentelemetry-sdk-1.24.0 opentelemetry-semantic-conventions-0.45b0 orjson-3.10.0 packaging-23.2 pandas-2.2.1 protobuf-4.25.3 pydantic-2.6.4 pydantic-core-2.16.3 pyjwt-2.8.0 pyprojroot-0.3.0 python-dateutil-2.9.0.post0 python-dotenv-1.0.1 python-engineio-4.9.0 python-graphql-client-0.4.3 python-multipart-0.0.9 python-socketio-5.11.2 pytz-2024.1 regex-2023.12.25 requests-2.31.0 simple-websocket-1.0.0 six-1.16.0 sniffio-1.3.1 socksio-1.0.0 starlette-0.32.0.post1 syncer-2.0.3 tenacity-8.2.3 tiktoken-0.5.1 tomli-2.0.1 tqdm-4.66.2 typing-extensions-4.11.0 typing-inspect-0.9.0 tzdata-2024.1 uptrace-1.22.0 urllib3-2.2.1 uvicorn-0.25.0 watchfiles-0.20.0 websockets-12.0 wrapt-1.16.0 wsproto-1.2.0 yarl-1.9.4 zipp-3.18.1

(WebRagQuery) C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery>

Farzad-R commented 7 months ago

Well, I can see that WebRAGQuery's environment was installed without any issue, so you can run the project now. I also loosened the pinned version of pillow in WebGPT, which should solve the problem with that project.
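For anyone hitting the same pin conflict: "loosening" a pin in requirements.txt means replacing the exact `==` pin with a version range so pip can pick a build that exists on your platform. The package name and versions below are illustrative, not the actual diff:

```
# before: exact pin (fails if that exact build is unavailable)
pillow==10.0.1
# after: any compatible 10.x release
pillow>=10.0.1,<11
```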

DrivenIdeaLab commented 7 months ago

C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain>python src\langchain\prepare_vectordb.py C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\vectorstores\__init__.py:35: LangChainDeprecationWarning: Importing vector stores from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.vectorstores import Chroma.

To install langchain-community run pip install -U langchain-community. warnings.warn( C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\document_loaders\__init__.py:36: LangChainDeprecationWarning: Importing document loaders from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.document_loaders import PyPDFLoader.

To install langchain-community run pip install -U langchain-community. warnings.warn(

=================== Splitter type: recursive Loading documents manually... Number of loaded documents: 5 Number of pages: 67

Chunking documents... Number of chunks: 262

Preparing vectordb... VectorDB is created and saved. Number of vectors in vectordb: 262
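The "recursive" splitter named in the log above works by trying coarse separators first (paragraph breaks) and only falling back to finer ones (lines, then spaces) for pieces that are still too large. A minimal pure-Python sketch of that idea — not LangChain's actual RecursiveCharacterTextSplitter; sizes and separators are illustrative, and the real splitter also merges small pieces back up toward chunk_size, which is omitted here:

```python
# Sketch of recursive character splitting: split on the coarsest
# separator available, recurse only into pieces that exceed chunk_size.
def recursive_split(text, chunk_size=200, separators=("\n\n", "\n", " ")):
    if len(text) <= chunk_size:
        return [text] if text else []
    for sep in separators:
        if sep in text:
            chunks = []
            for piece in text.split(sep):
                chunks.extend(recursive_split(piece, chunk_size, separators))
            return chunks
    # No separator left: hard-split at chunk_size boundaries.
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
```

The chunk count reported by the log (262 chunks from 67 pages) depends entirely on the configured chunk size and separators, so identical documents can yield different counts across configs.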

(llamaenv) C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain>conda deactivate

C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain>conda activate laingchainenv

(laingchainenv) C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain>python src\langchain\prepare_vectordb.py C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\langchain\vectorstores\__init__.py:35: LangChainDeprecationWarning: Importing vector stores from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.vectorstores import Chroma.

To install langchain-community run pip install -U langchain-community. warnings.warn( C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\langchain\document_loaders\__init__.py:36: LangChainDeprecationWarning: Importing document loaders from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.document_loaders import PyPDFLoader.

To install langchain-community run pip install -U langchain-community. warnings.warn(

=================== Splitter type: recursive Loading documents manually... Number of loaded documents: 5 Number of pages: 67

Chunking documents... Number of chunks: 262

Preparing vectordb... Traceback (most recent call last): File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain\src\langchain\prepare_vectordb.py", line 32, in <module> prep_langchain_vdb() File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain\src\langchain\prepare_vectordb.py", line 27, in prep_langchain_vdb prepare_vectordb_instance.prepare_and_save_vectordb() File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain\src\langchain\langchain_utils\langchain_index_utils.py", line 133, in prepare_and_save_vectordb vectordb = Chroma.from_documents( ^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\langchain_community\vectorstores\chroma.py", line 778, in from_documents return cls.from_texts( ^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\langchain_community\vectorstores\chroma.py", line 736, in from_texts chroma_collection.add_texts( File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\langchain_community\vectorstores\chroma.py", line 275, in add_texts embeddings = self._embedding_function.embed_documents(texts) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\langchain_community\embeddings\openai.py", line 668, in embed_documents return self._get_len_safe_embeddings(texts, engine=engine) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\langchain_community\embeddings\openai.py", line 494, in _get_len_safe_embeddings response = embed_with_retry( ^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\langchain_community\embeddings\openai.py", line 124, in embed_with_retry return _embed_with_retry(**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\tenacity\__init__.py", line 289, in
wrapped_f return self(f, *args, **kw) ^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\tenacity\__init__.py", line 379, in __call__ do = self.iter(retry_state=retry_state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\tenacity\__init__.py", line 314, in iter return fut.result() ^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\concurrent\futures\_base.py", line 449, in result return self.__get_result() ^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\concurrent\futures\_base.py", line 401, in __get_result raise self._exception File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\tenacity\__init__.py", line 382, in __call__ result = fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\langchain_community\embeddings\openai.py", line 121, in _embed_with_retry response = embeddings.client.create(**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\openai\api_resources\embedding.py", line 33, in create response = super().create(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 149, in create ) = cls.__prepare_create_request( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 80, in __prepare_create_request
typed_api_type = cls._get_api_type_and_version(api_type=api_type)[0] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\openai\api_resources\abstract\api_resource.py", line 169, in _get_api_type_and_version
else ApiType.from_str(openai.api_type) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\laingchainenv\Lib\site-packages\openai\util.py", line 35, in from_str if label.lower() == "azure": ^^^^^^^^^^^ AttributeError: 'NoneType' object has no attribute 'lower'
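The final AttributeError is the tell here: in openai 0.x, `ApiType.from_str` lowercases the configured `openai.api_type` without a None check, so a value that was never set (for example, an `OPENAI_API_TYPE` entry missing from the .env so `os.getenv` returns None) only blows up deep inside the embedding call instead of at configuration time. A minimal sketch of the failure mode and a None-safe variant — names are illustrative, not the library's actual code:

```python
# Sketch of the failing pattern: lowercasing a configured value
# that may be None (as in openai 0.x's ApiType.from_str).
def api_type_from_str(label):
    if label.lower() == "azure":  # AttributeError when label is None
        return "azure"
    return "open_ai"

# None-safe variant: fall back to the standard OpenAI endpoint type.
def api_type_from_str_safe(label):
    if label is None:
        return "open_ai"
    return "azure" if label.lower() == "azure" else "open_ai"
```

Practically, this points at the env configuration rather than the code: either set the expected api-type/key entries in the .env for laingchainenv, or avoid assigning None to `openai.api_type` in the first place.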

(laingchainenv) C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain>conda deactivate

C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain>conda activate llamaenv

(llamaenv) C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain>python src\llama_index\prepare_indexes.py C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\chat_models\__init__.py:31: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.chat_models import ChatAnyscale.

To install langchain-community run pip install -U langchain-community. warnings.warn( C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\chat_models\__init__.py:31: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.chat_models import ChatOpenAI.

To install langchain-community run pip install -U langchain-community. warnings.warn( C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\embeddings\__init__.py:29: LangChainDeprecationWarning: Importing embeddings from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.embeddings import HuggingFaceBgeEmbeddings.

To install langchain-community run pip install -U langchain-community. warnings.warn( C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\embeddings\__init__.py:29: LangChainDeprecationWarning: Importing embeddings from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.embeddings import HuggingFaceEmbeddings.

To install langchain-community run pip install -U langchain-community. warnings.warn( C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\llms\__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.llms import AI21.

To install langchain-community run pip install -U langchain-community. warnings.warn( C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\llms\__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.llms import Cohere.

To install langchain-community run pip install -U langchain-community. warnings.warn( C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\llms\__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.llms import FakeListLLM.

To install langchain-community run pip install -U langchain-community. warnings.warn( C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\llms\__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.llms import OpenAI.

To install langchain-community run pip install -U langchain-community. warnings.warn( Documents are loaded. Processing the documents and creating the index for Basic RAG... Traceback (most recent call last): File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 60, in map_httpcore_exceptions yield File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 218, in handle_request resp = self._pool.handle_request(req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpcore\_sync\connection_pool.py", line 214, in handle_request raise UnsupportedProtocol( httpcore.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 880, in _request response = self._client.send( ^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 901, in send response = self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 929, in _send_handling_auth response = self._send_handling_redirects( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 966, in _send_handling_redirects response = self._send_single_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 1002, in _send_single_request response = transport.handle_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 217, in handle_request with map_httpcore_exceptions(): File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\contextlib.py", line 158, in __exit__ self.gen.throw(typ, value, traceback) File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 77, in map_httpcore_exceptions raise mapped_exc(message) from exc httpx.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 60, in map_httpcore_exceptions yield File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 218, in handle_request resp = self._pool.handle_request(req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpcore\_sync\connection_pool.py", line 214, in handle_request raise UnsupportedProtocol( httpcore.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 880, in _request response = self._client.send( ^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 901, in send response = self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 929, in _send_handling_auth response = self._send_handling_redirects( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 966, in _send_handling_redirects response = self._send_single_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 1002, in _send_single_request response = transport.handle_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 217, in handle_request with map_httpcore_exceptions(): File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\contextlib.py", line 158, in __exit__ self.gen.throw(typ, value, traceback) File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 77, in map_httpcore_exceptions raise mapped_exc(message) from exc httpx.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 60, in map_httpcore_exceptions yield File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 218, in handle_request resp = self._pool.handle_request(req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpcore\_sync\connection_pool.py", line 214, in handle_request raise UnsupportedProtocol( httpcore.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 880, in _request response = self._client.send( ^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 901, in send response = self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 929, in _send_handling_auth response = self._send_handling_redirects( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 966, in _send_handling_redirects response = self._send_single_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 1002, in _send_single_request response = transport.handle_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 217, in handle_request with map_httpcore_exceptions(): File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\contextlib.py", line 158, in __exit__ self.gen.throw(typ, value, traceback) File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 77, in map_httpcore_exceptions raise mapped_exc(message) from exc httpx.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain\src\llama_index\prepare_indexes.py", line 84, in <module>
    prep_llama_indexes()
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain\src\llama_index\prepare_indexes.py", line 40, in prep_llama_indexes
    index = VectorStoreIndex.from_documents([merged_document],
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\base.py", line 107, in from_documents
    return cls(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 52, in __init__
    super().__init__(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\base.py", line 72, in __init__
    index_struct = self.build_index_from_nodes(nodes)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 262, in build_index_from_nodes
    return self._build_index_from_nodes(nodes, **insert_kwargs)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 243, in _build_index_from_nodes
    self._add_nodes_to_index(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 196, in _add_nodes_to_index
    nodes_batch = self._get_node_with_embedding(nodes_batch, show_progress)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 104, in _get_node_with_embedding
    id_to_embed_map = embed_nodes(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\utils.py", line 137, in embed_nodes
    new_embeddings = embed_model.get_text_embedding_batch(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\core\embeddings\base.py", line 256, in get_text_embedding_batch
    embeddings = self._get_text_embeddings(cur_batch)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\embeddings\openai.py", line 386, in _get_text_embeddings
    return get_embeddings(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\tenacity\__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\tenacity\__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\tenacity\__init__.py", line 325, in iter
    raise retry_exc.reraise()
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\tenacity\__init__.py", line 158, in reraise
    raise self.last_attempt.result()
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\concurrent\futures\_base.py", line 449, in result
    return self.__get_result()
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\tenacity\__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\embeddings\openai.py", line 162, in get_embeddings
    data = client.embeddings.create(input=list_of_text, model=engine, **kwargs).data
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\resources\embeddings.py", line 103, in create
    return self._post(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 1091, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 852, in request
    return self._request(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 899, in _request
    return self._retry_request(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 961, in _retry_request
    return self._request(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 899, in _request
    return self._retry_request(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 961, in _retry_request
    return self._request(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 908, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.

(llamaenv) C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain>pip install -U langchain-community
Requirement already satisfied: langchain-community in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (0.0.13)
Collecting langchain-community
  Using cached langchain_community-0.0.31-py3-none-any.whl.metadata (8.4 kB)
Requirement already satisfied: PyYAML>=5.3 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from langchain-community) (6.0.1)
Requirement already satisfied: SQLAlchemy<3,>=1.4 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from langchain-community) (2.0.23)
Requirement already satisfied: aiohttp<4.0.0,>=3.8.3 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from langchain-community) (3.8.6)
Requirement already satisfied: dataclasses-json<0.7,>=0.5.7 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from langchain-community) (0.5.14)
Collecting langchain-core<0.2.0,>=0.1.37 (from langchain-community)
  Using cached langchain_core-0.1.40-py3-none-any.whl.metadata (5.9 kB)
Collecting langsmith<0.2.0,>=0.1.0 (from langchain-community)
  Using cached langsmith-0.1.40-py3-none-any.whl.metadata (13 kB)
Requirement already satisfied: numpy<2,>=1 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from langchain-community) (1.26.2)
Requirement already satisfied: requests<3,>=2 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from langchain-community) (2.31.0)
Requirement already satisfied: tenacity<9.0.0,>=8.1.0 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from langchain-community) (8.2.3)
Requirement already satisfied: attrs>=17.3.0 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from aiohttp<4.0.0,>=3.8.3->langchain-community) (23.1.0)
Requirement already satisfied: charset-normalizer<4.0,>=2.0 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from aiohttp<4.0.0,>=3.8.3->langchain-community) (3.3.2)
Requirement already satisfied: multidict<7.0,>=4.5 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from aiohttp<4.0.0,>=3.8.3->langchain-community) (6.0.4)
Requirement already satisfied: async-timeout<5.0,>=4.0.0a3 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from aiohttp<4.0.0,>=3.8.3->langchain-community) (4.0.3)
Requirement already satisfied: yarl<2.0,>=1.0 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from aiohttp<4.0.0,>=3.8.3->langchain-community) (1.9.2)
Requirement already satisfied: frozenlist>=1.1.1 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from aiohttp<4.0.0,>=3.8.3->langchain-community) (1.4.0)
Requirement already satisfied: aiosignal>=1.1.2 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from aiohttp<4.0.0,>=3.8.3->langchain-community) (1.3.1)
Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from dataclasses-json<0.7,>=0.5.7->langchain-community) (3.20.1)
Requirement already satisfied: typing-inspect<1,>=0.4.0 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from dataclasses-json<0.7,>=0.5.7->langchain-community) (0.9.0)
Requirement already satisfied: jsonpatch<2.0,>=1.33 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from langchain-core<0.2.0,>=0.1.37->langchain-community) (1.33)
Requirement already satisfied: packaging<24.0,>=23.2 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from langchain-core<0.2.0,>=0.1.37->langchain-community) (23.2)
Requirement already satisfied: pydantic<3,>=1 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from langchain-core<0.2.0,>=0.1.37->langchain-community) (2.5.1)
Collecting orjson<4.0.0,>=3.9.14 (from langsmith<0.2.0,>=0.1.0->langchain-community)
  Using cached orjson-3.10.0-cp311-none-win_amd64.whl.metadata (50 kB)
Requirement already satisfied: idna<4,>=2.5 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from requests<3,>=2->langchain-community) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from requests<3,>=2->langchain-community) (1.26.18)
Requirement already satisfied: certifi>=2017.4.17 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from requests<3,>=2->langchain-community) (2023.7.22)
Requirement already satisfied: typing-extensions>=4.2.0 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from SQLAlchemy<3,>=1.4->langchain-community) (4.8.0)
Requirement already satisfied: greenlet!=0.4.17 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from SQLAlchemy<3,>=1.4->langchain-community) (3.0.1)
Requirement already satisfied: jsonpointer>=1.9 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from jsonpatch<2.0,>=1.33->langchain-core<0.2.0,>=0.1.37->langchain-community) (2.4)
Requirement already satisfied: annotated-types>=0.4.0 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from pydantic<3,>=1->langchain-core<0.2.0,>=0.1.37->langchain-community) (0.6.0)
Requirement already satisfied: pydantic-core==2.14.3 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from pydantic<3,>=1->langchain-core<0.2.0,>=0.1.37->langchain-community) (2.14.3)
Requirement already satisfied: mypy-extensions>=0.3.0 in c:\users\npall\anaconda3\envs\llamaenv\lib\site-packages (from typing-inspect<1,>=0.4.0->dataclasses-json<0.7,>=0.5.7->langchain-community) (1.0.0)
Using cached langchain_community-0.0.31-py3-none-any.whl (1.9 MB)
Using cached langchain_core-0.1.40-py3-none-any.whl (276 kB)
Using cached langsmith-0.1.40-py3-none-any.whl (87 kB)
Using cached orjson-3.10.0-cp311-none-win_amd64.whl (139 kB)
Installing collected packages: orjson, langsmith, langchain-core, langchain-community
  Attempting uninstall: orjson
    Found existing installation: orjson 3.9.10
    Uninstalling orjson-3.9.10:
      Successfully uninstalled orjson-3.9.10
  Attempting uninstall: langsmith
    Found existing installation: langsmith 0.0.79
    Uninstalling langsmith-0.0.79:
      Successfully uninstalled langsmith-0.0.79
  Attempting uninstall: langchain-core
    Found existing installation: langchain-core 0.1.12
    Uninstalling langchain-core-0.1.12:
      Successfully uninstalled langchain-core-0.1.12
  Attempting uninstall: langchain-community
    Found existing installation: langchain-community 0.0.13
    Uninstalling langchain-community-0.0.13:
      Successfully uninstalled langchain-community-0.0.13
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
langchain 0.1.1 requires langsmith<0.1.0,>=0.0.77, but you have langsmith 0.1.40 which is incompatible.
Successfully installed langchain-community-0.0.31 langchain-core-0.1.40 langsmith-0.1.40 orjson-3.10.0
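For anyone hitting the same resolver warning: the root of it is that langchain 0.1.1 pins langsmith<0.1.0, while the upgraded langchain-community pulls in langsmith 0.1.40. A sketch of one way to clear it (not the project's official fix) is to upgrade the whole langchain family in a single resolver pass so pip can choose a mutually compatible set:

```shell
# Upgrade langchain, langchain-core, and langchain-community together so the
# resolver picks a langsmith version every package accepts (the exact versions
# installed depend on what pip resolves at the time you run this).
pip install -U langchain langchain-core langchain-community

# Confirm no packages are left in a conflicting state.
pip check
```

If the project needs the older pinned langchain 0.1.1, the alternative is to downgrade langsmith back below 0.1.0 instead of upgrading everything.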

(llamaenv) C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain>python src\llama_index\prepare_indexes.py
C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\chat_models\__init__.py:31: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import ChatAnyscale`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\chat_models\__init__.py:31: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import ChatOpenAI`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\embeddings\__init__.py:29: LangChainDeprecationWarning: Importing embeddings from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.embeddings import HuggingFaceBgeEmbeddings`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\embeddings\__init__.py:29: LangChainDeprecationWarning: Importing embeddings from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.embeddings import HuggingFaceEmbeddings`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\llms\__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.llms import AI21`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\llms\__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.llms import Cohere`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\llms\__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.llms import FakeListLLM`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\langchain\llms\__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.llms import OpenAI`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
Documents are loaded.
Processing the documents and creating the index for Basic RAG...
Traceback (most recent call last):
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 60, in map_httpcore_exceptions
    yield
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 218, in handle_request
    resp = self._pool.handle_request(req)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpcore\_sync\connection_pool.py", line 214, in handle_request
    raise UnsupportedProtocol(
httpcore.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 880, in _request
    response = self._client.send(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 901, in send
    response = self._send_handling_auth(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 929, in _send_handling_auth
    response = self._send_handling_redirects(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 966, in _send_handling_redirects
    response = self._send_single_request(request)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 1002, in _send_single_request
    response = transport.handle_request(request)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 217, in handle_request
    with map_httpcore_exceptions():
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 77, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 60, in map_httpcore_exceptions
    yield
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 218, in handle_request
    resp = self._pool.handle_request(req)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpcore\_sync\connection_pool.py", line 214, in handle_request
    raise UnsupportedProtocol(
httpcore.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 880, in _request
    response = self._client.send(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 901, in send
    response = self._send_handling_auth(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 929, in _send_handling_auth
    response = self._send_handling_redirects(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 966, in _send_handling_redirects
    response = self._send_single_request(request)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 1002, in _send_single_request
    response = transport.handle_request(request)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 217, in handle_request
    with map_httpcore_exceptions():
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 77, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 60, in map_httpcore_exceptions
    yield
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 218, in handle_request
    resp = self._pool.handle_request(req)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpcore\_sync\connection_pool.py", line 214, in handle_request
    raise UnsupportedProtocol(
httpcore.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 880, in _request
    response = self._client.send(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 901, in send
    response = self._send_handling_auth(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 929, in _send_handling_auth
    response = self._send_handling_redirects(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 966, in _send_handling_redirects
    response = self._send_single_request(request)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_client.py", line 1002, in _send_single_request
    response = transport.handle_request(request)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 217, in handle_request
    with map_httpcore_exceptions():
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\httpx\_transports\default.py", line 77, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain\src\llama_index\prepare_indexes.py", line 84, in <module>
    prep_llama_indexes()
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain\src\llama_index\prepare_indexes.py", line 40, in prep_llama_indexes
    index = VectorStoreIndex.from_documents([merged_document],
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\base.py", line 107, in from_documents
    return cls(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 52, in __init__
    super().__init__(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\base.py", line 72, in __init__
    index_struct = self.build_index_from_nodes(nodes)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 262, in build_index_from_nodes
    return self._build_index_from_nodes(nodes, **insert_kwargs)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 243, in _build_index_from_nodes
    self._add_nodes_to_index(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 196, in _add_nodes_to_index
    nodes_batch = self._get_node_with_embedding(nodes_batch, show_progress)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 104, in _get_node_with_embedding
    id_to_embed_map = embed_nodes(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\indices\utils.py", line 137, in embed_nodes
    new_embeddings = embed_model.get_text_embedding_batch(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\core\embeddings\base.py", line 256, in get_text_embedding_batch
    embeddings = self._get_text_embeddings(cur_batch)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\embeddings\openai.py", line 386, in _get_text_embeddings
    return get_embeddings(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\tenacity\__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\tenacity\__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\tenacity\__init__.py", line 325, in iter
    raise retry_exc.reraise()
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\tenacity\__init__.py", line 158, in reraise
    raise self.last_attempt.result()
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\concurrent\futures\_base.py", line 449, in result
    return self.__get_result()
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\tenacity\__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\llama_index\embeddings\openai.py", line 162, in get_embeddings
    data = client.embeddings.create(input=list_of_text, model=engine, **kwargs).data
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\resources\embeddings.py", line 103, in create
    return self._post(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 1091, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 852, in request
    return self._request(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 899, in _request
    return self._retry_request(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 961, in _retry_request
    return self._request(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 899, in _request
    return self._retry_request(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 961, in _retry_request
    return self._request(
  File "C:\Users\npall\anaconda3\envs\llamaenv\Lib\site-packages\openai\_base_client.py", line 908, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.

(llamaenv) C:\Users\npall\Documents\LLM-Zero-to-Hundred\RAGMaster-LlamaIndex-vs-Langchain>
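On the wall of LangChainDeprecationWarnings above: they all describe the same migration of chat models, embeddings, and LLM wrappers from `langchain.*` to `langchain_community.*`. A small compatibility shim that prefers the new location (a sketch; `resolve_chat_openai` is a name made up here, not part of either library):

```python
import importlib

def resolve_chat_openai():
    """Return the ChatOpenAI class from whichever package provides it."""
    for module_name in (
        "langchain_community.chat_models",  # new home as of langchain >= 0.1
        "langchain.chat_models",            # deprecated path, removed in 0.2.0
    ):
        try:
            return getattr(importlib.import_module(module_name), "ChatOpenAI")
        except (ImportError, AttributeError):
            continue
    raise ImportError("ChatOpenAI not found; run: pip install -U langchain-community")
```

Once everything in the repo targets langchain >= 0.1 with langchain-community installed, the shim can be dropped in favor of a plain `from langchain_community.chat_models import ChatOpenAI`, which also silences the warnings.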

DrivenIdeaLab commented 7 months ago

(webgpt) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\webgpt> pip install -r requirements.txt
Collecting duckduckgo_search==3.9.6 (from -r requirements.txt (line 1)) Using cached duckduckgo_search-3.9.6-py3-none-any.whl.metadata (21 kB) Collecting openai==0.28.0 (from -r requirements.txt (line 2)) Using cached openai-0.28.0-py3-none-any.whl.metadata (13 kB) Collecting Pillow (from -r requirements.txt (line 3)) Using cached pillow-10.3.0-cp311-cp311-win_amd64.whl.metadata (9.4 kB) Collecting pydantic==2.6.4 (from -r requirements.txt (line 4)) Using cached pydantic-2.6.4-py3-none-any.whl.metadata (85 kB) Collecting pyprojroot==0.3.0 (from -r requirements.txt (line 5)) Using cached pyprojroot-0.3.0-py3-none-any.whl.metadata (4.8 kB) Collecting python-dotenv==1.0.1 (from -r requirements.txt (line 6)) Using cached python_dotenv-1.0.1-py3-none-any.whl.metadata (23 kB) Collecting PyYAML==6.0.1 (from -r requirements.txt (line 7)) Using cached PyYAML-6.0.1-cp311-cp311-win_amd64.whl.metadata (2.1 kB) Collecting streamlit==1.28.2 (from -r requirements.txt (line 8)) Using cached streamlit-1.28.2-py2.py3-none-any.whl.metadata (8.1 kB) Collecting streamlit_chat==0.1.1 (from -r requirements.txt (line 9)) Using cached streamlit_chat-0.1.1-py3-none-any.whl.metadata (4.2 kB) Collecting aiofiles>=23.2.1 (from duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached aiofiles-23.2.1-py3-none-any.whl.metadata (9.7 kB) Collecting click>=8.1.7 (from duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached click-8.1.7-py3-none-any.whl.metadata (3.0 kB) Collecting lxml>=4.9.3 (from duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached lxml-5.2.1-cp311-cp311-win_amd64.whl.metadata (3.5 kB) Collecting httpx>=0.25.1 (from httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached httpx-0.27.0-py3-none-any.whl.metadata (7.2 kB) Collecting requests>=2.20 (from openai==0.28.0->-r requirements.txt (line 2)) Using cached requests-2.31.0-py3-none-any.whl.metadata (4.6 kB) Collecting tqdm (from 
openai==0.28.0->-r requirements.txt (line 2)) Using cached tqdm-4.66.2-py3-none-any.whl.metadata (57 kB) Collecting aiohttp (from openai==0.28.0->-r requirements.txt (line 2)) Using cached aiohttp-3.9.3-cp311-cp311-win_amd64.whl.metadata (7.6 kB) Collecting annotated-types>=0.4.0 (from pydantic==2.6.4->-r requirements.txt (line 4)) Using cached annotated_types-0.6.0-py3-none-any.whl.metadata (12 kB) Collecting pydantic-core==2.16.3 (from pydantic==2.6.4->-r requirements.txt (line 4)) Using cached pydantic_core-2.16.3-cp311-none-win_amd64.whl.metadata (6.6 kB) Collecting typing-extensions>=4.6.1 (from pydantic==2.6.4->-r requirements.txt (line 4)) Using cached typing_extensions-4.11.0-py3-none-any.whl.metadata (3.0 kB) Collecting altair<6,>=4.0 (from streamlit==1.28.2->-r requirements.txt (line 8)) Downloading altair-5.3.0-py3-none-any.whl.metadata (9.2 kB) Collecting blinker<2,>=1.0.0 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached blinker-1.7.0-py3-none-any.whl.metadata (1.9 kB) Collecting cachetools<6,>=4.0 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached cachetools-5.3.3-py3-none-any.whl.metadata (5.3 kB) Collecting importlib-metadata<7,>=1.4 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached importlib_metadata-6.11.0-py3-none-any.whl.metadata (4.9 kB) Collecting numpy<2,>=1.19.3 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached numpy-1.26.4-cp311-cp311-win_amd64.whl.metadata (61 kB) Collecting packaging<24,>=16.8 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached packaging-23.2-py3-none-any.whl.metadata (3.2 kB) Collecting pandas<3,>=1.3.0 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached pandas-2.2.1-cp311-cp311-win_amd64.whl.metadata (19 kB) Collecting protobuf<5,>=3.20 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached protobuf-4.25.3-cp310-abi3-win_amd64.whl.metadata (541 bytes) Collecting pyarrow>=6.0 (from streamlit==1.28.2->-r 
requirements.txt (line 8)) Using cached pyarrow-15.0.2-cp311-cp311-win_amd64.whl.metadata (3.1 kB) Collecting python-dateutil<3,>=2.7.3 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl.metadata (8.4 kB) Collecting rich<14,>=10.14.0 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached rich-13.7.1-py3-none-any.whl.metadata (18 kB) Collecting tenacity<9,>=8.1.0 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached tenacity-8.2.3-py3-none-any.whl.metadata (1.0 kB) Collecting toml<2,>=0.10.1 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached toml-0.10.2-py2.py3-none-any.whl.metadata (7.1 kB) Collecting tzlocal<6,>=1.1 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached tzlocal-5.2-py3-none-any.whl.metadata (7.8 kB) Collecting validators<1,>=0.2 (from streamlit==1.28.2->-r requirements.txt (line 8)) Downloading validators-0.28.0-py3-none-any.whl.metadata (3.6 kB) Collecting gitpython!=3.1.19,<4,>=3.0.7 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached GitPython-3.1.43-py3-none-any.whl.metadata (13 kB) Collecting pydeck<1,>=0.8.0b4 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached pydeck-0.8.1b0-py2.py3-none-any.whl.metadata (3.9 kB) Collecting tornado<7,>=6.0.3 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached tornado-6.4-cp38-abi3-win_amd64.whl.metadata (2.6 kB) Collecting watchdog>=2.1.5 (from streamlit==1.28.2->-r requirements.txt (line 8)) Using cached watchdog-4.0.0-py3-none-win_amd64.whl.metadata (37 kB) Collecting jinja2 (from altair<6,>=4.0->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached Jinja2-3.1.3-py3-none-any.whl.metadata (3.3 kB) Collecting jsonschema>=3.0 (from altair<6,>=4.0->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached jsonschema-4.21.1-py3-none-any.whl.metadata (7.8 kB) Collecting toolz (from altair<6,>=4.0->streamlit==1.28.2->-r 
requirements.txt (line 8)) Using cached toolz-0.12.1-py3-none-any.whl.metadata (5.1 kB) Collecting colorama (from click>=8.1.7->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached colorama-0.4.6-py2.py3-none-any.whl.metadata (17 kB) Collecting gitdb<5,>=4.0.1 (from gitpython!=3.1.19,<4,>=3.0.7->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached gitdb-4.0.11-py3-none-any.whl.metadata (1.2 kB) Collecting anyio (from httpx>=0.25.1->httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached anyio-4.3.0-py3-none-any.whl.metadata (4.6 kB) Collecting certifi (from httpx>=0.25.1->httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached certifi-2024.2.2-py3-none-any.whl.metadata (2.2 kB) Collecting httpcore==1.* (from httpx>=0.25.1->httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached httpcore-1.0.5-py3-none-any.whl.metadata (20 kB) Collecting idna (from httpx>=0.25.1->httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached idna-3.6-py3-none-any.whl.metadata (9.9 kB) Collecting sniffio (from httpx>=0.25.1->httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached sniffio-1.3.1-py3-none-any.whl.metadata (3.9 kB) Collecting h11<0.15,>=0.13 (from httpcore==1.*->httpx>=0.25.1->httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached h11-0.14.0-py3-none-any.whl.metadata (8.2 kB) Collecting brotli (from httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached Brotli-1.1.0-cp311-cp311-win_amd64.whl.metadata (5.6 kB) Collecting h2<5,>=3 (from httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached h2-4.1.0-py3-none-any.whl.metadata (3.6 kB) Collecting socksio==1.* (from
httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached socksio-1.0.0-py3-none-any.whl.metadata (6.1 kB) Collecting zipp>=0.5 (from importlib-metadata<7,>=1.4->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached zipp-3.18.1-py3-none-any.whl.metadata (3.5 kB) Collecting pytz>=2020.1 (from pandas<3,>=1.3.0->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached pytz-2024.1-py2.py3-none-any.whl.metadata (22 kB) Collecting tzdata>=2022.7 (from pandas<3,>=1.3.0->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached tzdata-2024.1-py2.py3-none-any.whl.metadata (1.4 kB) Collecting six>=1.5 (from python-dateutil<3,>=2.7.3->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached six-1.16.0-py2.py3-none-any.whl.metadata (1.8 kB) Collecting charset-normalizer<4,>=2 (from requests>=2.20->openai==0.28.0->-r requirements.txt (line 2)) Using cached charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl.metadata (34 kB) Collecting urllib3<3,>=1.21.1 (from requests>=2.20->openai==0.28.0->-r requirements.txt (line 2)) Using cached urllib3-2.2.1-py3-none-any.whl.metadata (6.4 kB) Collecting markdown-it-py>=2.2.0 (from rich<14,>=10.14.0->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached markdown_it_py-3.0.0-py3-none-any.whl.metadata (6.9 kB) Collecting pygments<3.0.0,>=2.13.0 (from rich<14,>=10.14.0->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached pygments-2.17.2-py3-none-any.whl.metadata (2.6 kB) Collecting aiosignal>=1.1.2 (from aiohttp->openai==0.28.0->-r requirements.txt (line 2)) Using cached aiosignal-1.3.1-py3-none-any.whl.metadata (4.0 kB) Collecting attrs>=17.3.0 (from aiohttp->openai==0.28.0->-r requirements.txt (line 2)) Using cached attrs-23.2.0-py3-none-any.whl.metadata (9.5 kB) Collecting frozenlist>=1.1.1 (from aiohttp->openai==0.28.0->-r requirements.txt (line 2)) Using cached frozenlist-1.4.1-cp311-cp311-win_amd64.whl.metadata (12 kB) Collecting multidict<7.0,>=4.5 
(from aiohttp->openai==0.28.0->-r requirements.txt (line 2)) Using cached multidict-6.0.5-cp311-cp311-win_amd64.whl.metadata (4.3 kB) Collecting yarl<2.0,>=1.0 (from aiohttp->openai==0.28.0->-r requirements.txt (line 2)) Using cached yarl-1.9.4-cp311-cp311-win_amd64.whl.metadata (32 kB) Collecting smmap<6,>=3.0.1 (from gitdb<5,>=4.0.1->gitpython!=3.1.19,<4,>=3.0.7->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached smmap-5.0.1-py3-none-any.whl.metadata (4.3 kB) Collecting hyperframe<7,>=6.0 (from h2<5,>=3->httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached hyperframe-6.0.1-py3-none-any.whl.metadata (2.7 kB) Collecting hpack<5,>=4.0 (from h2<5,>=3->httpx[brotli,http2,socks]>=0.25.1->duckduckgo_search==3.9.6->-r requirements.txt (line 1)) Using cached hpack-4.0.0-py3-none-any.whl.metadata (2.5 kB) Collecting MarkupSafe>=2.0 (from jinja2->altair<6,>=4.0->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl.metadata (3.1 kB) Collecting jsonschema-specifications>=2023.03.6 (from jsonschema>=3.0->altair<6,>=4.0->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached jsonschema_specifications-2023.12.1-py3-none-any.whl.metadata (3.0 kB) Collecting referencing>=0.28.4 (from jsonschema>=3.0->altair<6,>=4.0->streamlit==1.28.2->-r requirements.txt (line 8)) Downloading referencing-0.34.0-py3-none-any.whl.metadata (2.8 kB) Collecting rpds-py>=0.7.1 (from jsonschema>=3.0->altair<6,>=4.0->streamlit==1.28.2->-r requirements.txt (line 8)) Downloading rpds_py-0.18.0-cp311-none-win_amd64.whl.metadata (4.2 kB) Collecting mdurl~=0.1 (from markdown-it-py>=2.2.0->rich<14,>=10.14.0->streamlit==1.28.2->-r requirements.txt (line 8)) Using cached mdurl-0.1.2-py3-none-any.whl.metadata (1.6 kB) Using cached duckduckgo_search-3.9.6-py3-none-any.whl (25 kB) Using cached openai-0.28.0-py3-none-any.whl (76 kB) Using cached pydantic-2.6.4-py3-none-any.whl (394 kB) Using cached 
pyprojroot-0.3.0-py3-none-any.whl (7.6 kB) Using cached python_dotenv-1.0.1-py3-none-any.whl (19 kB) Using cached PyYAML-6.0.1-cp311-cp311-win_amd64.whl (144 kB) Using cached streamlit-1.28.2-py2.py3-none-any.whl (8.4 MB) Using cached streamlit_chat-0.1.1-py3-none-any.whl (1.2 MB) Using cached pydantic_core-2.16.3-cp311-none-win_amd64.whl (1.9 MB) Using cached pillow-10.3.0-cp311-cp311-win_amd64.whl (2.5 MB) Using cached aiofiles-23.2.1-py3-none-any.whl (15 kB) Downloading altair-5.3.0-py3-none-any.whl (857 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 857.8/857.8 kB 7.7 MB/s eta 0:00:00 Using cached annotated_types-0.6.0-py3-none-any.whl (12 kB) Using cached blinker-1.7.0-py3-none-any.whl (13 kB) Using cached cachetools-5.3.3-py3-none-any.whl (9.3 kB) Using cached click-8.1.7-py3-none-any.whl (97 kB) Using cached GitPython-3.1.43-py3-none-any.whl (207 kB) Using cached httpx-0.27.0-py3-none-any.whl (75 kB) Using cached httpcore-1.0.5-py3-none-any.whl (77 kB) Using cached socksio-1.0.0-py3-none-any.whl (12 kB) Using cached importlib_metadata-6.11.0-py3-none-any.whl (23 kB) Using cached lxml-5.2.1-cp311-cp311-win_amd64.whl (3.8 MB) Using cached numpy-1.26.4-cp311-cp311-win_amd64.whl (15.8 MB) Using cached packaging-23.2-py3-none-any.whl (53 kB) Using cached pandas-2.2.1-cp311-cp311-win_amd64.whl (11.6 MB) Using cached protobuf-4.25.3-cp310-abi3-win_amd64.whl (413 kB) Using cached pyarrow-15.0.2-cp311-cp311-win_amd64.whl (24.8 MB) Using cached pydeck-0.8.1b0-py2.py3-none-any.whl (4.8 MB) Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB) Using cached requests-2.31.0-py3-none-any.whl (62 kB) Using cached rich-13.7.1-py3-none-any.whl (240 kB) Using cached tenacity-8.2.3-py3-none-any.whl (24 kB) Using cached toml-0.10.2-py2.py3-none-any.whl (16 kB) Downloading tornado-6.4-cp38-abi3-win_amd64.whl (436 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 437.0/437.0 kB 13.8 MB/s eta 0:00:00 Using cached typing_extensions-4.11.0-py3-none-any.whl (34 kB) Using 
cached tzlocal-5.2-py3-none-any.whl (17 kB) Downloading validators-0.28.0-py3-none-any.whl (39 kB) Using cached watchdog-4.0.0-py3-none-win_amd64.whl (82 kB) Using cached aiohttp-3.9.3-cp311-cp311-win_amd64.whl (365 kB) Using cached tqdm-4.66.2-py3-none-any.whl (78 kB) Using cached aiosignal-1.3.1-py3-none-any.whl (7.6 kB) Using cached attrs-23.2.0-py3-none-any.whl (60 kB) Using cached certifi-2024.2.2-py3-none-any.whl (163 kB) Using cached charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl (99 kB) Using cached frozenlist-1.4.1-cp311-cp311-win_amd64.whl (50 kB) Using cached gitdb-4.0.11-py3-none-any.whl (62 kB) Using cached h2-4.1.0-py3-none-any.whl (57 kB) Using cached idna-3.6-py3-none-any.whl (61 kB) Using cached Jinja2-3.1.3-py3-none-any.whl (133 kB) Downloading jsonschema-4.21.1-py3-none-any.whl (85 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 85.5/85.5 kB ? eta 0:00:00 Using cached markdown_it_py-3.0.0-py3-none-any.whl (87 kB) Using cached multidict-6.0.5-cp311-cp311-win_amd64.whl (28 kB) Using cached pygments-2.17.2-py3-none-any.whl (1.2 MB) Using cached pytz-2024.1-py2.py3-none-any.whl (505 kB) Using cached six-1.16.0-py2.py3-none-any.whl (11 kB) Using cached tzdata-2024.1-py2.py3-none-any.whl (345 kB) Using cached urllib3-2.2.1-py3-none-any.whl (121 kB) Using cached yarl-1.9.4-cp311-cp311-win_amd64.whl (76 kB) Using cached zipp-3.18.1-py3-none-any.whl (8.2 kB) Using cached anyio-4.3.0-py3-none-any.whl (85 kB) Using cached sniffio-1.3.1-py3-none-any.whl (10 kB) Using cached Brotli-1.1.0-cp311-cp311-win_amd64.whl (357 kB) Using cached colorama-0.4.6-py2.py3-none-any.whl (25 kB) Downloading toolz-0.12.1-py3-none-any.whl (56 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 56.1/56.1 kB 3.1 MB/s eta 0:00:00 Using cached h11-0.14.0-py3-none-any.whl (58 kB) Using cached hpack-4.0.0-py3-none-any.whl (32 kB) Using cached hyperframe-6.0.1-py3-none-any.whl (12 kB) Downloading jsonschema_specifications-2023.12.1-py3-none-any.whl (18 kB) Using cached 
MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl (17 kB) Using cached mdurl-0.1.2-py3-none-any.whl (10.0 kB) Downloading referencing-0.34.0-py3-none-any.whl (26 kB) Downloading rpds_py-0.18.0-cp311-none-win_amd64.whl (206 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 206.7/206.7 kB 786.0 kB/s eta 0:00:00 Using cached smmap-5.0.1-py3-none-any.whl (24 kB) Installing collected packages: pytz, brotli, zipp, watchdog, validators, urllib3, tzdata, typing-extensions, tornado, toolz, toml, tenacity, socksio, sniffio, smmap, six, rpds-py, PyYAML, python-dotenv, pygments, protobuf, Pillow, packaging, numpy, multidict, mdurl, MarkupSafe, lxml, idna, hyperframe, hpack, h11, frozenlist, colorama, charset-normalizer, certifi, cachetools, blinker, attrs, annotated-types, aiofiles, yarl, tzlocal, tqdm, requests, referencing, python-dateutil, pyprojroot, pydantic-core, pyarrow, markdown-it-py, jinja2, importlib-metadata, httpcore, h2, gitdb, click, anyio, aiosignal, rich, pydeck, pydantic, pandas, jsonschema-specifications, httpx, gitpython, aiohttp, openai, jsonschema, duckduckgo_search, altair, streamlit, streamlit_chat Successfully installed MarkupSafe-2.1.5 Pillow-10.3.0 PyYAML-6.0.1 aiofiles-23.2.1 aiohttp-3.9.3 aiosignal-1.3.1 altair-5.3.0 annotated-types-0.6.0 anyio-4.3.0 attrs-23.2.0 blinker-1.7.0 brotli-1.1.0 cachetools-5.3.3 certifi-2024.2.2 charset-normalizer-3.3.2 click-8.1.7 colorama-0.4.6 duckduckgo_search-3.9.6 frozenlist-1.4.1 gitdb-4.0.11 gitpython-3.1.43 h11-0.14.0 h2-4.1.0 hpack-4.0.0 httpcore-1.0.5 httpx-0.27.0 hyperframe-6.0.1 idna-3.6 importlib-metadata-6.11.0 jinja2-3.1.3 jsonschema-4.21.1 jsonschema-specifications-2023.12.1 lxml-5.2.1 markdown-it-py-3.0.0 mdurl-0.1.2 multidict-6.0.5 numpy-1.26.4 openai-0.28.0 packaging-23.2 pandas-2.2.1 protobuf-4.25.3 pyarrow-15.0.2 pydantic-2.6.4 pydantic-core-2.16.3 pydeck-0.8.1b0 pygments-2.17.2 pyprojroot-0.3.0 python-dateutil-2.9.0.post0 python-dotenv-1.0.1 pytz-2024.1 referencing-0.34.0 requests-2.31.0 rich-13.7.1 
rpds-py-0.18.0 six-1.16.0 smmap-5.0.1 sniffio-1.3.1 socksio-1.0.0 streamlit-1.28.2 streamlit_chat-0.1.1 tenacity-8.2.3 toml-0.10.2 toolz-0.12.1 tornado-6.4 tqdm-4.66.2 typing-extensions-4.11.0 tzdata-2024.1 tzlocal-5.2 urllib3-2.2.1 validators-0.28.0 watchdog-4.0.0 yarl-1.9.4 zipp-3.18.1 (webgpt) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\webgpt>

DrivenIdeaLab commented 7 months ago

(base) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\LLM-Fine-Tuning> conda create --name llmfinetune python=3.11 Channels:

Package Plan

environment location: C:\Users\npall\anaconda3\envs\llmfinetune

added / updated specs:

The following NEW packages will be INSTALLED:

bzip2              pkgs/main/win-64::bzip2-1.0.8-h2bbff1b_5
ca-certificates    pkgs/main/win-64::ca-certificates-2024.3.11-haa95532_0
libffi             pkgs/main/win-64::libffi-3.4.4-hd77b12b_0
openssl            pkgs/main/win-64::openssl-3.0.13-h2bbff1b_0
pip                pkgs/main/win-64::pip-23.3.1-py311haa95532_0
python             pkgs/main/win-64::python-3.11.8-he1021f5_0
setuptools         pkgs/main/win-64::setuptools-68.2.2-py311haa95532_0
sqlite             pkgs/main/win-64::sqlite-3.41.2-h2bbff1b_0
tk                 pkgs/main/win-64::tk-8.6.12-h2bbff1b_0
tzdata             pkgs/main/noarch::tzdata-2024a-h04d1e81_0
vc                 pkgs/main/win-64::vc-14.2-h21ff451_1
vs2015_runtime     pkgs/main/win-64::vs2015_runtime-14.27.29016-h5e58377_2
wheel              pkgs/main/win-64::wheel-0.41.2-py311haa95532_0
xz                 pkgs/main/win-64::xz-5.4.6-h8cc25b3_0
zlib               pkgs/main/win-64::zlib-1.2.13-h8cc25b3_0

Proceed ([y]/n)? y

Downloading and Extracting Packages:

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate llmfinetune
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(base) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\LLM-Fine-Tuning> conda activate llmfinetune (llmfinetune) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\LLM-Fine-Tuning> pip install -r requirements.txt Collecting chainlit==1.0.500 (from -r requirements.txt (line 1)) Using cached chainlit-1.0.500-py3-none-any.whl.metadata (5.5 kB) Collecting datasets==2.15.0 (from -r requirements.txt (line 2)) Using cached datasets-2.15.0-py3-none-any.whl.metadata (20 kB) Collecting fitz==0.0.1.dev2 (from -r requirements.txt (line 3)) Downloading fitz-0.0.1.dev2-py2.py3-none-any.whl.metadata (816 bytes) Collecting jsonlines==4.0.0 (from -r requirements.txt (line 4)) Using cached jsonlines-4.0.0-py3-none-any.whl.metadata (1.6 kB) Collecting openai==0.28.0 (from -r requirements.txt (line 5)) Using cached openai-0.28.0-py3-none-any.whl.metadata (13 kB) Collecting pandas==2.2.1 (from -r requirements.txt (line 6)) Using cached pandas-2.2.1-cp311-cp311-win_amd64.whl.metadata (19 kB) Collecting pydantic==2.6.4 (from -r requirements.txt (line 7)) Using cached pydantic-2.6.4-py3-none-any.whl.metadata (85 kB) Collecting pyprojroot==0.3.0 (from -r requirements.txt (line 8)) Using cached pyprojroot-0.3.0-py3-none-any.whl.metadata (4.8 kB) Collecting python-dotenv==1.0.1 (from -r requirements.txt (line 9)) Using cached python_dotenv-1.0.1-py3-none-any.whl.metadata (23 kB) Collecting PyYAML==6.0.1 (from -r requirements.txt (line 10)) Using cached PyYAML-6.0.1-cp311-cp311-win_amd64.whl.metadata (2.1 kB) ERROR: Could not find a version that satisfies the requirement torch==2.1.1+cu121 (from versions: 2.0.0, 2.0.1, 2.1.0, 2.1.1, 2.1.2, 2.2.0, 2.2.1, 2.2.2) ERROR: No matching distribution found for torch==2.1.1+cu121 (llmfinetune) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\LLM-Fine-Tuning> conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia Channels:

Package Plan

environment location: C:\Users\npall\anaconda3\envs\llmfinetune

added / updated specs:

The following packages will be downloaded:

package                    |            build
---------------------------|-----------------
blas-1.0                   |              mkl           6 KB
brotli-python-1.0.9        |  py311hd77b12b_7         310 KB
certifi-2024.2.2           |  py311haa95532_0         162 KB
charset-normalizer-2.0.4   |     pyhd3eb1b0_0          35 KB
cuda-cccl-12.4.127         |                0         1.4 MB  nvidia
cuda-cudart-12.1.105       |                0         964 KB  nvidia
cuda-cudart-dev-12.1.105   |                0         549 KB  nvidia
cuda-cupti-12.1.105        |                0        11.6 MB  nvidia
cuda-libraries-12.1.0      |                0           1 KB  nvidia
cuda-libraries-dev-12.1.0  |                0           1 KB  nvidia
cuda-nvrtc-12.1.105        |                0        73.2 MB  nvidia
cuda-nvrtc-dev-12.1.105    |                0        16.5 MB  nvidia
cuda-nvtx-12.1.105         |                0          41 KB  nvidia
cuda-opencl-12.4.127       |                0          11 KB  nvidia
cuda-opencl-dev-12.4.127   |                0          75 KB  nvidia
cuda-profiler-api-12.4.127 |                0          19 KB  nvidia
cuda-runtime-12.1.0        |                0           1 KB  nvidia
filelock-3.13.1            |  py311haa95532_0          24 KB
freetype-2.12.1            |       ha860e81_0         490 KB
gmpy2-2.1.2                |  py311h7f96b67_0         140 KB
idna-3.4                   |  py311haa95532_0         101 KB
intel-openmp-2023.1.0      |   h59b6b97_46320         2.7 MB
jinja2-3.1.3               |  py311haa95532_0         354 KB
jpeg-9e                    |       h2bbff1b_1         320 KB
lerc-3.0                   |       hd77b12b_0         120 KB
libcublas-12.1.0.26        |                0          39 KB  nvidia
libcublas-dev-12.1.0.26    |                0       348.3 MB  nvidia
libcufft-11.0.2.4          |                0           6 KB  nvidia
libcufft-dev-11.0.2.4      |                0       102.6 MB  nvidia
libcurand-10.3.5.147       |                0           4 KB  nvidia
libcurand-dev-10.3.5.147   |                0        49.7 MB  nvidia
libcusolver-11.4.4.55      |                0          30 KB  nvidia
libcusolver-dev-11.4.4.55  |                0        95.7 MB  nvidia
libcusparse-12.0.2.55      |                0          12 KB  nvidia
libcusparse-dev-12.0.2.55  |                0       162.5 MB  nvidia
libdeflate-1.17            |       h2bbff1b_1         153 KB
libjpeg-turbo-2.0.0        |       h196d8e1_0         618 KB
libnpp-12.0.2.50           |                0         305 KB  nvidia
libnpp-dev-12.0.2.50       |                0       135.6 MB  nvidia
libnvjitlink-12.1.105      |                0        67.3 MB  nvidia
libnvjitlink-dev-12.1.105  |                0        13.8 MB  nvidia
libnvjpeg-12.1.1.14        |                0           5 KB  nvidia
libnvjpeg-dev-12.1.1.14    |                0         2.4 MB  nvidia
libpng-1.6.39              |       h8cc25b3_0         369 KB
libtiff-4.5.1              |       hd77b12b_0         1.1 MB
libuv-1.44.2               |       h2bbff1b_0         288 KB
libwebp-base-1.3.2         |       h2bbff1b_0         306 KB
lz4-c-1.9.4                |       h2bbff1b_0         143 KB
markupsafe-2.1.3           |  py311h2bbff1b_0          28 KB
mkl-2023.1.0               |   h6b88ed4_46358       155.9 MB
mkl-service-2.4.0          |  py311h2bbff1b_1          44 KB
mkl_fft-1.3.8              |  py311h2bbff1b_0         179 KB
mkl_random-1.2.4           |  py311h59b6b97_0         228 KB
mpc-1.1.0                  |       h7edee0f_1         260 KB
mpfr-4.0.2                 |       h62dcd97_1         1.5 MB
mpir-3.0.0                 |       hec2e145_1         1.3 MB
mpmath-1.3.0               |  py311haa95532_0         1.0 MB
networkx-3.1               |  py311haa95532_0         3.3 MB
numpy-1.26.4               |  py311hdab7c0b_0          11 KB
numpy-base-1.26.4          |  py311hd01c5d8_0         9.1 MB
openjpeg-2.4.0             |       h4fc8c34_0         219 KB
pillow-10.2.0              |  py311h2bbff1b_0         953 KB
pysocks-1.7.1              |  py311haa95532_0          36 KB
pytorch-2.2.2              |py3.11_cuda12.1_cudnn8_0        1.24 GB  pytorch
pytorch-cuda-12.1          |       hde6ce7c_5           4 KB  pytorch
pytorch-mutex-1.0          |             cuda           3 KB  pytorch
pyyaml-6.0.1               |  py311h2bbff1b_0         185 KB
requests-2.31.0            |  py311haa95532_1         125 KB
sympy-1.12                 |  py311haa95532_0        14.4 MB
tbb-2021.8.0               |       h59b6b97_0         149 KB
torchaudio-2.2.2           |      py311_cu121         7.1 MB  pytorch
torchvision-0.17.2         |      py311_cu121         7.9 MB  pytorch
typing_extensions-4.9.0    |  py311haa95532_1          70 KB
urllib3-2.1.0              |  py311haa95532_1         195 KB
win_inet_pton-1.1.0        |  py311haa95532_0          10 KB
yaml-0.2.5                 |       he774522_0          62 KB
zstd-1.5.5                 |       hd43e919_0         682 KB
------------------------------------------------------------
                                       Total:        2.50 GB

The following NEW packages will be INSTALLED:

blas                pkgs/main/win-64::blas-1.0-mkl
brotli-python       pkgs/main/win-64::brotli-python-1.0.9-py311hd77b12b_7
certifi             pkgs/main/win-64::certifi-2024.2.2-py311haa95532_0
charset-normalizer  pkgs/main/noarch::charset-normalizer-2.0.4-pyhd3eb1b0_0
cuda-cccl           nvidia/win-64::cuda-cccl-12.4.127-0
cuda-cudart         nvidia/win-64::cuda-cudart-12.1.105-0
cuda-cudart-dev     nvidia/win-64::cuda-cudart-dev-12.1.105-0
cuda-cupti          nvidia/win-64::cuda-cupti-12.1.105-0
cuda-libraries      nvidia/win-64::cuda-libraries-12.1.0-0
cuda-libraries-dev  nvidia/win-64::cuda-libraries-dev-12.1.0-0
cuda-nvrtc          nvidia/win-64::cuda-nvrtc-12.1.105-0
cuda-nvrtc-dev      nvidia/win-64::cuda-nvrtc-dev-12.1.105-0
cuda-nvtx           nvidia/win-64::cuda-nvtx-12.1.105-0
cuda-opencl         nvidia/win-64::cuda-opencl-12.4.127-0
cuda-opencl-dev     nvidia/win-64::cuda-opencl-dev-12.4.127-0
cuda-profiler-api   nvidia/win-64::cuda-profiler-api-12.4.127-0
cuda-runtime        nvidia/win-64::cuda-runtime-12.1.0-0
filelock            pkgs/main/win-64::filelock-3.13.1-py311haa95532_0
freetype            pkgs/main/win-64::freetype-2.12.1-ha860e81_0
gmpy2               pkgs/main/win-64::gmpy2-2.1.2-py311h7f96b67_0
idna                pkgs/main/win-64::idna-3.4-py311haa95532_0
intel-openmp        pkgs/main/win-64::intel-openmp-2023.1.0-h59b6b97_46320
jinja2              pkgs/main/win-64::jinja2-3.1.3-py311haa95532_0
jpeg                pkgs/main/win-64::jpeg-9e-h2bbff1b_1
lerc                pkgs/main/win-64::lerc-3.0-hd77b12b_0
libcublas           nvidia/win-64::libcublas-12.1.0.26-0
libcublas-dev       nvidia/win-64::libcublas-dev-12.1.0.26-0
libcufft            nvidia/win-64::libcufft-11.0.2.4-0
libcufft-dev        nvidia/win-64::libcufft-dev-11.0.2.4-0
libcurand           nvidia/win-64::libcurand-10.3.5.147-0
libcurand-dev       nvidia/win-64::libcurand-dev-10.3.5.147-0
libcusolver         nvidia/win-64::libcusolver-11.4.4.55-0
libcusolver-dev     nvidia/win-64::libcusolver-dev-11.4.4.55-0
libcusparse         nvidia/win-64::libcusparse-12.0.2.55-0
libcusparse-dev     nvidia/win-64::libcusparse-dev-12.0.2.55-0
libdeflate          pkgs/main/win-64::libdeflate-1.17-h2bbff1b_1
libjpeg-turbo       pkgs/main/win-64::libjpeg-turbo-2.0.0-h196d8e1_0
libnpp              nvidia/win-64::libnpp-12.0.2.50-0
libnpp-dev          nvidia/win-64::libnpp-dev-12.0.2.50-0
libnvjitlink        nvidia/win-64::libnvjitlink-12.1.105-0
libnvjitlink-dev    nvidia/win-64::libnvjitlink-dev-12.1.105-0
libnvjpeg           nvidia/win-64::libnvjpeg-12.1.1.14-0
libnvjpeg-dev       nvidia/win-64::libnvjpeg-dev-12.1.1.14-0
libpng              pkgs/main/win-64::libpng-1.6.39-h8cc25b3_0
libtiff             pkgs/main/win-64::libtiff-4.5.1-hd77b12b_0
libuv               pkgs/main/win-64::libuv-1.44.2-h2bbff1b_0
libwebp-base        pkgs/main/win-64::libwebp-base-1.3.2-h2bbff1b_0
lz4-c               pkgs/main/win-64::lz4-c-1.9.4-h2bbff1b_0
markupsafe          pkgs/main/win-64::markupsafe-2.1.3-py311h2bbff1b_0
mkl                 pkgs/main/win-64::mkl-2023.1.0-h6b88ed4_46358
mkl-service         pkgs/main/win-64::mkl-service-2.4.0-py311h2bbff1b_1
mkl_fft             pkgs/main/win-64::mkl_fft-1.3.8-py311h2bbff1b_0
mkl_random          pkgs/main/win-64::mkl_random-1.2.4-py311h59b6b97_0
mpc                 pkgs/main/win-64::mpc-1.1.0-h7edee0f_1
mpfr                pkgs/main/win-64::mpfr-4.0.2-h62dcd97_1
mpir                pkgs/main/win-64::mpir-3.0.0-hec2e145_1
mpmath              pkgs/main/win-64::mpmath-1.3.0-py311haa95532_0
networkx            pkgs/main/win-64::networkx-3.1-py311haa95532_0
numpy               pkgs/main/win-64::numpy-1.26.4-py311hdab7c0b_0
numpy-base          pkgs/main/win-64::numpy-base-1.26.4-py311hd01c5d8_0
openjpeg            pkgs/main/win-64::openjpeg-2.4.0-h4fc8c34_0
pillow              pkgs/main/win-64::pillow-10.2.0-py311h2bbff1b_0
pysocks             pkgs/main/win-64::pysocks-1.7.1-py311haa95532_0
pytorch             pytorch/win-64::pytorch-2.2.2-py3.11_cuda12.1_cudnn8_0
pytorch-cuda        pytorch/win-64::pytorch-cuda-12.1-hde6ce7c_5
pytorch-mutex       pytorch/noarch::pytorch-mutex-1.0-cuda
pyyaml              pkgs/main/win-64::pyyaml-6.0.1-py311h2bbff1b_0
requests            pkgs/main/win-64::requests-2.31.0-py311haa95532_1
sympy               pkgs/main/win-64::sympy-1.12-py311haa95532_0
tbb                 pkgs/main/win-64::tbb-2021.8.0-h59b6b97_0
torchaudio          pytorch/win-64::torchaudio-2.2.2-py311_cu121
torchvision         pytorch/win-64::torchvision-0.17.2-py311_cu121
typing_extensions   pkgs/main/win-64::typing_extensions-4.9.0-py311haa95532_1
urllib3             pkgs/main/win-64::urllib3-2.1.0-py311haa95532_1
win_inet_pton       pkgs/main/win-64::win_inet_pton-1.1.0-py311haa95532_0
yaml                pkgs/main/win-64::yaml-0.2.5-he774522_0
zstd                pkgs/main/win-64::zstd-1.5.5-hd43e919_0

Proceed ([y]/n)? y

Downloading and Extracting Packages:

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
(llmfinetune) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\LLM-Fine-Tuning> pip install -r requirements.txt
Collecting chainlit==1.0.500 (from -r requirements.txt (line 1))
Using cached chainlit-1.0.500-py3-none-any.whl.metadata (5.5 kB)
Collecting datasets==2.15.0 (from -r requirements.txt (line 2))
Using cached datasets-2.15.0-py3-none-any.whl.metadata (20 kB)
Collecting fitz==0.0.1.dev2 (from -r requirements.txt (line 3))
Using cached fitz-0.0.1.dev2-py2.py3-none-any.whl.metadata (816 bytes)
Collecting jsonlines==4.0.0 (from -r requirements.txt (line 4))
Using cached jsonlines-4.0.0-py3-none-any.whl.metadata (1.6 kB)
Collecting openai==0.28.0 (from -r requirements.txt (line 5))
Using cached openai-0.28.0-py3-none-any.whl.metadata (13 kB)
Collecting pandas==2.2.1 (from -r requirements.txt (line 6))
Using cached pandas-2.2.1-cp311-cp311-win_amd64.whl.metadata (19 kB)
Collecting pydantic==2.6.4 (from -r requirements.txt (line 7))
Using cached pydantic-2.6.4-py3-none-any.whl.metadata (85 kB)
Collecting pyprojroot==0.3.0 (from -r requirements.txt (line 8))
Using cached pyprojroot-0.3.0-py3-none-any.whl.metadata (4.8 kB)
Collecting python-dotenv==1.0.1 (from -r requirements.txt (line 9))
Using cached python_dotenv-1.0.1-py3-none-any.whl.metadata (23 kB)
Requirement already satisfied: PyYAML==6.0.1 in c:\users\npall\anaconda3\envs\llmfinetune\lib\site-packages (from -r requirements.txt (line 10)) (6.0.1)
ERROR: Could not find a version that satisfies the requirement torch==2.1.1+cu121 (from versions: 2.0.0, 2.0.1, 2.1.0, 2.1.1, 2.1.2, 2.2.0, 2.2.1, 2.2.2)
ERROR: No matching distribution found for torch==2.1.1+cu121
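For reference: `+cu121` local-version wheels such as `torch==2.1.1+cu121` are not published on PyPI, which is why pip reports "No matching distribution found". They live on PyTorch's own wheel index, so pip has to be pointed at it. A sketch of the two usual ways around this (verify the CUDA tag against your driver before using it):

```shell
# CUDA wheels are hosted on PyTorch's index, not PyPI, so a plain
# `pip install torch==2.1.1+cu121` cannot find them.
pip install torch==2.1.1+cu121 --index-url https://download.pytorch.org/whl/cu121

# Alternatively, drop the "+cu121" suffix in requirements.txt (torch==2.1.1)
# and let pip resolve the default build from PyPI, or install via conda as
# done later in this thread.
```

The conda command used further down in the thread (`conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia`) is the conda-channel equivalent of the same thing.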

DrivenIdeaLab commented 7 months ago

(base) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery> conda activate webragquery
(webragquery) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery>
(base) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery> chainlit run src\app.py -h

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\npall\anaconda3\Scripts\chainlit.exe\__main__.py", line 7, in <module>
  File "C:\Users\npall\Anaconda3\Lib\site-packages\click\core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\npall\Anaconda3\Lib\site-packages\click\core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "C:\Users\npall\Anaconda3\Lib\site-packages\click\core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\npall\Anaconda3\Lib\site-packages\click\core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\npall\Anaconda3\Lib\site-packages\click\core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\npall\Anaconda3\Lib\site-packages\chainlit\cli\__init__.py", line 154, in chainlit_run
    run_chainlit(target)
  File "C:\Users\npall\Anaconda3\Lib\site-packages\chainlit\cli\__init__.py", line 55, in run_chainlit
    load_module(config.run.module_name)
  File "C:\Users\npall\Anaconda3\Lib\site-packages\chainlit\config.py", line 282, in load_module
    spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\src\app.py", line 26, in <module>
    from utils.functions_prep import PrepareFunctions
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\src\utils\functions_prep.py", line 5, in <module>
    from utils.specific_url_prep_func import prepare_the_requested_url_for_q_and_a
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\src\utils\specific_url_prep_func.py", line 8, in <module>
    from utils.prepare_url_vectordb import PrepareURLVectorDB
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\src\utils\prepare_url_vectordb.py", line 5, in <module>
    from langchain.document_loaders import WebBaseLoader
  File "C:\Users\npall\Anaconda3\Lib\site-packages\langchain\document_loaders\__init__.py", line 18, in <module>
    from langchain_community.document_loaders.acreom import AcreomLoader
  File "C:\Users\npall\Anaconda3\Lib\site-packages\langchain_community\document_loaders\__init__.py", line 163, in <module>
    from langchain_community.document_loaders.pebblo import PebbloSafeLoader
  File "C:\Users\npall\Anaconda3\Lib\site-packages\langchain_community\document_loaders\pebblo.py", line 5, in <module>
    import pwd
ModuleNotFoundError: No module named 'pwd'
(base) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery>
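For reference: `pwd` (the Unix password-database module) is part of the Python standard library on POSIX systems only, so any import chain that reaches an unconditional `import pwd` fails on Windows. A quick stdlib-only probe that shows why this traceback can only occur on Windows:

```python
import importlib.util
import sys

# The 'pwd' module exists only on POSIX platforms; on Windows,
# find_spec returns None and a plain `import pwd` raises
# ModuleNotFoundError -- exactly the error in the traceback above.
available = importlib.util.find_spec("pwd") is not None
print(f"platform={sys.platform} pwd_available={available}")
```

Note also that the traceback's paths point at the base Anaconda environment's `site-packages`, not the project env, so the `langchain_community` being imported is whichever version happens to be installed in `(base)`; running from the activated project environment (as done later in the thread) sidesteps that.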

DrivenIdeaLab commented 7 months ago

WebGPT loads Streamlit, but then errors out:

File "C:\Users\npall\anaconda3\envs\webgpt\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 534, in _run_script
    exec(code, module.__dict__)
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebGPT\src\webgpt_app.py", line 103, in <module>
    first_llm_response = Apputils.ask_llm_function_caller(
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebGPT\src\utils\app_utils.py", line 97, in ask_llm_function_caller
    response = openai.ChatCompletion.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\npall\anaconda3\envs\webgpt\Lib\site-packages\openai\api_resources\chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\npall\anaconda3\envs\webgpt\Lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 149, in create
    ) = cls.__prepare_create_request(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\npall\anaconda3\envs\webgpt\Lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 80, in __prepare_create_request
    typed_api_type = cls._get_api_type_and_version(api_type=api_type)[0]
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\npall\anaconda3\envs\webgpt\Lib\site-packages\openai\api_resources\abstract\api_resource.py", line 169, in _get_api_type_and_version
    else ApiType.from_str(openai.api_type)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\npall\anaconda3\envs\webgpt\Lib\site-packages\openai\util.py", line 35, in from_str
    if label.lower() == "azure":
       ^^^^^^^^^^^

Farzad-R commented 7 months ago

From the messages, I see that the WebRAGQuery and WebGPT environments were installed correctly and you can now run the projects. The last issue arises from the torch installation I mentioned earlier: you need to install PyTorch according to your system's requirements.

The Azure error arises from a mismatch in the OpenAI library version. The project requires openai==0.28, so make sure your installed openai library matches that version. You also need the Azure credentials for this. You can find the full description in the YouTube videos I made for each project.
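For reference, openai==0.28 configures Azure access through module-level settings that it reads from environment variables at import time, and the `label.lower()` failure in the traceback above is consistent with `openai.api_type` being misconfigured. A minimal sketch of the settings that version expects for Azure; every value below is a placeholder, not something from this repo:

```python
import os

# openai==0.28 reads these environment variables when it is imported
# (they populate openai.api_type / api_base / api_version / api_key).
# All values are placeholders -- substitute your own Azure credentials.
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_BASE"] = "https://YOUR-RESOURCE.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "2023-07-01-preview"  # example version string
os.environ["OPENAI_API_KEY"] = "YOUR-KEY"

print("api_type set to:", os.environ["OPENAI_API_TYPE"])
```

These can equally go in the project's `.env` file, since the repo uses python-dotenv.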

DrivenIdeaLab commented 7 months ago
image
DrivenIdeaLab commented 7 months ago

Ok, slowly getting somewhere; just need to figure out the cfg.py files and requirements.

You champion

Thanks, still some errors but getting there.

DrivenIdeaLab commented 7 months ago

(base) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery> chainlit run src\app.py -h
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\npall\anaconda3\Scripts\chainlit.exe\__main__.py", line 7, in <module>
  File "C:\Users\npall\Anaconda3\Lib\site-packages\click\core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "C:\Users\npall\Anaconda3\Lib\site-packages\click\core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "C:\Users\npall\Anaconda3\Lib\site-packages\click\core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\Users\npall\Anaconda3\Lib\site-packages\click\core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "C:\Users\npall\Anaconda3\Lib\site-packages\click\core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "C:\Users\npall\Anaconda3\Lib\site-packages\chainlit\cli\__init__.py", line 154, in chainlit_run
    run_chainlit(target)
  File "C:\Users\npall\Anaconda3\Lib\site-packages\chainlit\cli\__init__.py", line 55, in run_chainlit
    load_module(config.run.module_name)
  File "C:\Users\npall\Anaconda3\Lib\site-packages\chainlit\config.py", line 282, in load_module
    spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\src\app.py", line 26, in <module>
    from utils.functions_prep import PrepareFunctions
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\src\utils\functions_prep.py", line 5, in <module>
    from utils.specific_url_prep_func import prepare_the_requested_url_for_q_and_a
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\src\utils\specific_url_prep_func.py", line 8, in <module>
    from utils.prepare_url_vectordb import PrepareURLVectorDB
  File "C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\src\utils\prepare_url_vectordb.py", line 5, in <module>
    from langchain.document_loaders import WebBaseLoader
  File "C:\Users\npall\Anaconda3\Lib\site-packages\langchain\document_loaders\__init__.py", line 18, in <module>
    from langchain_community.document_loaders.acreom import AcreomLoader
  File "C:\Users\npall\Anaconda3\Lib\site-packages\langchain_community\document_loaders\__init__.py", line 163, in <module>
    from langchain_community.document_loaders.pebblo import PebbloSafeLoader
  File "C:\Users\npall\Anaconda3\Lib\site-packages\langchain_community\document_loaders\pebblo.py", line 5, in <module>
    import pwd
ModuleNotFoundError: No module named 'pwd'

(base) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery> conda activate webragquery
(webragquery) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery> chainlit run src\app.py -h
2024-04-08 05:22:26 - Created default translation directory at C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\.chainlit\translations
2024-04-08 05:22:26 - Created default translation file at C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\.chainlit\translations\de.json
2024-04-08 05:22:26 - Created default translation file at C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\.chainlit\translations\en-US.json
2024-04-08 05:22:26 - Created default translation file at C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery\.chainlit\translations\pt-BR.json
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\npall\anaconda3\envs\webragquery\Scripts\chainlit.exe\__main__.py", line 4, in <module>
  File "C:\Users\npall\anaconda3\envs\WebRagQuery\Lib\site-packages\chainlit\__init__.py", line 24, in <module>
    from chainlit.action import Action
  File "C:\Users\npall\anaconda3\envs\WebRagQuery\Lib\site-packages\chainlit\action.py", line 5, in <module>
    from chainlit.telemetry import trace_event
  File "C:\Users\npall\anaconda3\envs\WebRagQuery\Lib\site-packages\chainlit\telemetry.py", line 12, in <module>
    from chainlit.config import config
  File "C:\Users\npall\anaconda3\envs\WebRagQuery\Lib\site-packages\chainlit\config.py", line 469, in <module>
    config = load_config()
  File "C:\Users\npall\anaconda3\envs\WebRagQuery\Lib\site-packages\chainlit\config.py", line 438, in load_config
    settings = load_settings()
  File "C:\Users\npall\anaconda3\envs\WebRagQuery\Lib\site-packages\chainlit\config.py", line 408, in load_settings
    features_settings = FeaturesSettings(**features_settings)
  File "C:\Users\npall\anaconda3\envs\WebRagQuery\Lib\site-packages\pydantic\_internal\_dataclasses.py", line 135, in __init__
    s.__pydantic_validator__.validate_python(ArgsKwargs(args, kwargs), self_instance=s)
pydantic_core._pydantic_core.ValidationError: 1 validation error for FeaturesSettings
multi_modal
  Input should be a dictionary or an instance of MultiModalFeature [type=dataclass_type, input_value=True, input_type=bool]
    For further information visit https://errors.pydantic.dev/2.6/v/dataclass_type
(webragquery) PS C:\Users\npall\Documents\LLM-Zero-to-Hundred\WebRAGQuery>
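For context on the first traceback: `pwd` is a POSIX-only standard-library module (it reads the Unix password database), so it does not exist on a native Windows build of CPython. The failure comes from `langchain_community`'s `pebblo` loader importing it at module load time, not from a package missing on PyPI, which is why no `pip install` can fix it. A quick way to check which situation you are in (this script is only an illustration, not part of the repo):

```python
import importlib.util
import sys

# 'pwd' only ships with POSIX builds of CPython; on native Windows,
# find_spec returns None and importing it raises ModuleNotFoundError.
has_pwd = importlib.util.find_spec("pwd") is not None
print(f"platform={sys.platform}, pwd available={has_pwd}")
```

In practice, running the project under WSL (as the earlier `pip install` logs were doing) sidesteps this class of error entirely, since `pwd` exists on Linux.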

Farzad-R commented 7 months ago

Did you solve the errors? The last one looks like a dependency version mismatch rather than a missing library. Check the pydantic version that I pinned in the project; the version of this library is very important, as they have recently changed parts of its API. I also explained it in this video:

https://youtu.be/P3bNGBTDiKM?si=4AHK88yJzZlbJenr
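One quick sanity check in the activated environment is to compare the installed pydantic major version against the pin in the repo's requirements.txt (the traceback above comes from pydantic 2.6 validating chainlit's config, so a 1.x-vs-2.x mismatch is the first thing to rule out). A minimal sketch; the `2` here is just an example value, not necessarily the project's actual pin:

```python
from importlib import metadata

def installed_major(package: str) -> int:
    """Return the major version number of an installed distribution."""
    return int(metadata.version(package).split(".")[0])

# Illustrative check only; consult requirements.txt for the real pin.
try:
    major = installed_major("pydantic")
    print(f"pydantic major version: {major}")
except metadata.PackageNotFoundError:
    print("pydantic is not installed in this environment")
```

If the major version differs from the pinned one, `pip install -r requirements.txt --force-reinstall` inside the project's environment should bring it back in line.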

Farzad-R commented 7 months ago

I assume you solved the dependency issues, so I'll close this.