oobabooga / text-generation-webui

A Gradio web UI for Large Language Models.
GNU Affero General Public License v3.0

Miniconda hook not found #1351

Closed Kostazs closed 11 months ago

Kostazs commented 1 year ago

Describe the bug

When I try to run any of the files in the extracted oobabooga folder, I am hit with the error shown below.

Is there an existing issue for this?

Reproduction

I downloaded the "one-click installers" oobabooga_windows.zip and extracted it on my D drive, where I install most things. I noticed that some folders were missing compared to the YouTube tutorial I was following (https://youtu.be/FOyqcETVUCs?t=112, linked with timestamp), but I brushed it off and continued. When I try to run any of the files, I get the error message above. I searched for a solution but found nothing, which makes me think I am probably missing something that everyone else already has installed.

Screenshot

(two screenshots attached: "error", "error 2")

Logs

The system cannot find the path specified.

Miniconda hook not found.
Press any key to continue . . .

System Info

Windows 10, 64-bit
GT 1030 (2 GB VRAM)
Ryzen 3 1200 (quad-core)
16 GB RAM @ 2500 MHz

jllllll commented 1 year ago

The installer was overhauled yesterday for cross-platform support. Any tutorials written before that are now outdated. Run start_windows.bat to install and start the webui. I'm assuming you were trying to run the update script?

desva commented 1 year ago

I had to debug this last night. If you still see this error, it is likely that the TMP environment variable is set incorrectly: it needs to point to a folder that exists on the same drive as the install. This fixed the install for me.
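
For anyone who wants to check this before changing anything, a quick look at the current values from a Command Prompt is enough. This is only a sketch; the D: drive mentioned in the comments is just an example:

rem Show where the temporary directories currently point
echo TMP  = %TMP%
echo TEMP = %TEMP%
rem Per the comment above, they should name a folder that actually exists on the
rem same drive the webui was extracted to (e.g. somewhere on D: for a D: install)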

xNul commented 1 year ago

I had to debug this last night. If you still see this error, it is likely that the TMP environment variable is set incorrectly: it needs to point to a folder that exists on the same drive as the install. This fixed the install for me.

@desva this line of code fixed it for you?

set TMP=D:\temp
Kostazs commented 1 year ago

I had to debug this last night. If you still see this error, it is likely that the TMP environment variable is set incorrectly: it needs to point to a folder that exists on the same drive as the install. This fixed the install for me.

Thank you bro! This fixed my problem.

Kostazs commented 1 year ago

I had to debug this last night. If you still see this error, it is likely that the TMP environment variable is set incorrectly: it needs to point to a folder that exists on the same drive as the install. This fixed the install for me.

@desva this line of code fixed it for you?

set TMP=D:\temp

I changed the TMP directory as shown in this video https://youtu.be/JQfPSj7yYJw: go to Control Panel -> System -> Advanced, open the "Environment Variables" dialog, and change the TEMP and TMP variable settings under "User variables".
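
For reference, the same change can be made from a Command Prompt with setx instead of the Control Panel dialog. This is only a sketch; D:\temp is an example path and should sit on whatever drive the webui is installed on:

rem Create the folder first so the variables point at something that exists
mkdir D:\temp 2>nul
rem setx writes the user-level variables; it only affects newly opened windows
setx TEMP D:\temp
setx TMP D:\temp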

desva commented 1 year ago

Yes - you just need to set TMP in the batch file so that it points at the install drive (and make sure the folder exists).
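
A minimal sketch of that edit, placed near the top of the one-click .bat file (the D:\temp path is an assumption - use a folder on your own install drive):

rem Make sure the temp folder exists on the install drive, then point TMP/TEMP at it
if not exist "D:\temp" mkdir "D:\temp"
set "TMP=D:\temp"
set "TEMP=D:\temp"
rem ...the rest of the batch file continues unchanged...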

xNul commented 1 year ago

@desva Interesting, I'm not able to reproduce this on my Windows system on my D drive. Could you confirm that these changes fix the installation process for you? https://github.com/xNul/one-click-installers

johkhknlknkjhk commented 1 year ago

Quick question: can someone tell me what EXACTLY to write in the TEMP and TMP variables?

desva commented 1 year ago

You need to set the TMP environment variable to point to a temporary folder on the same drive as your installation.


jllllll commented 1 year ago

@johkhknlknkjhk The installer that xNul linked already has it set.

xNul commented 1 year ago

@johkhknlknkjhk did the changes I made work for you?

kodra-dev commented 1 year ago

Still doesn't work for me. Got

The system cannot find the path specified.

Miniconda hook not found.
Press any key to continue . . .

Even after I set TMP=C:\temp

xNul commented 1 year ago

@kodra-dev I'm going to need some more info: what OS are you running, what does your install log say, and what drive are you using?

Webslug commented 1 year ago

Yeah, this definitely no longer works and I am getting the same error - very annoyed.

Mine is installed in a folder called AI with no spaces. I have made the TEMP folder and I still get the error. I've tried all the solutions I can think of and I'm about to give up. I set the environment variables and tried to install Miniconda manually.

Fatal Python error: init_fs_encoding: failed to get the Python codec of the filesystem encoding
Python runtime state: core initialized
ModuleNotFoundError: No module named 'encodings'

Current thread 0x00008fb0 (most recent call first):

Miniconda hook not found.
jllllll commented 1 year ago

Yeah this definitely no longer works and I am getting the same error - It is clear that this no longer works.

Mine is installed in a folder called AI with no spaces, I have made the TEMP folder and I still get the error. I've tried all the solutions I can think of and I'm about to give up.

Fatal Python error: init_fs_encoding: failed to get the Python codec of the filesystem encoding
Python runtime state: core initialized
ModuleNotFoundError: No module named 'encodings'

Current thread 0x00008fb0 (most recent call first):

Miniconda hook not found.

Try adding this to start_windows.bat just after set TEMP=%cd%\installer_files:

SET PYTHONNOUSERSITE=1
SET PYTHONPATH=
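
For anyone unsure where those lines go, the surrounding part of start_windows.bat would end up looking roughly like this (only the set TEMP line is taken from the comment above; the rest of the real file is omitted):

set TEMP=%cd%\installer_files

rem added lines: keep a system-wide Python install from leaking into the bundled Miniconda
SET PYTHONNOUSERSITE=1
SET PYTHONPATH=
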
Webslug commented 1 year ago

Thanks jllllll, I added the two lines you suggested; however, I still get the error. Can you help further?

SET PYTHONNOUSERSITE=1
SET PYTHONPATH=C:\Program Files\Python310

Python path configuration:
  PYTHONHOME = 'C:\Program Files\Python310\Scripts'
  PYTHONPATH = 'C:\Program Files\Python310'
  program name = 'D:\AI\oobabooga\installer_files\conda\python.exe'
  isolated = 0
  environment = 1
  user site = 0
  import site = 1
  sys._base_executable = 'D:\AI\oobabooga\installer_files\conda\python.exe'
  sys.base_prefix = 'C:\Program Files\Python310\Scripts'
  sys.base_exec_prefix = 'C:\Program Files\Python310\Scripts'
  sys.platlibdir = 'lib'
  sys.executable = 'D:\AI\oobabooga\installer_files\conda\python.exe'
  sys.prefix = 'C:\Program Files\Python310\Scripts'
  sys.exec_prefix = 'C:\Program Files\Python310\Scripts'
  sys.path = [
    'C:\Program Files\Python310',
    'D:\AI\oobabooga\installer_files\conda\python310.zip',
    'C:\Program Files\Python310\Scripts\DLLs',
    'C:\Program Files\Python310\Scripts\lib',
    'D:\AI\oobabooga\installer_files\conda',
  ]
Fatal Python error: init_fs_encoding: failed to get the Python codec of the filesystem encoding
Python runtime state: core initialized
ModuleNotFoundError: No module named 'encodings'

Current thread 0x00004090 (most recent call first):

Miniconda hook not found.
Webslug commented 1 year ago

Numerous people on Twitter have also mentioned that the new installer is broken and that they could not get it to install.

I can confirm that the one-click installer no longer works on Windows when a system-wide Python is installed.

jllllll commented 1 year ago

@Webslug Why did you add a path to PYTHONPATH? It was left deliberately blank to clear it. Try it again with PYTHONHOME as well:

SET PYTHONNOUSERSITE=1
SET PYTHONPATH=
SET PYTHONHOME=

PYTHONPATH and PYTHONHOME should never be set to anything. I replicated this exact error by setting those variables to a path. They need to be unset.
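
A quick way to check whether a system-wide Python install is interfering, and to clear the variables for the current session, is something like this sketch (run it in the same Command Prompt you then launch start_windows.bat from; when PYTHONHOME points at another Python's folders, the bundled Miniconda Python looks for its standard library in the wrong place, which is what produces the "No module named 'encodings'" failure above):

rem List any PYTHON* variables inherited from a system-wide install
set PYTHON

rem Clear the two problem variables for this session only
set PYTHONHOME=
set PYTHONPATH=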

Webslug commented 1 year ago

Thanks jllllll, that seems to have solved the problem.

After I selected A for NVIDIA, the script downloaded a bunch of files; however, after 5 minutes it just seems to stop and I am worried it has crashed. Am I just being impatient?

jllllll commented 1 year ago

Thanks jllllll, that seems to have solved the problem.

After I selected A for NVIDIA, the script downloaded a bunch of files; however, after 5 minutes it just seems to stop and I am worried it has crashed. Am I just being impatient?

You'll know if it crashed. The script uses Miniconda to download things, and Miniconda has a fairly small limit on the number of downloads it shows at once. Sometimes, for unknown reasons, it takes longer than usual and seems to have frozen.

SuperFurias commented 1 year ago

I had the same issue: every file inside the folder showed the errors "Miniconda hook not found" and "The system cannot find the path specified".

After some time, I tried manually installing Miniconda inside my folder by running the installer that the .bat file automatically downloaded, and it turns out that Miniconda is unable to install into folders whose names contain "!". My folder was called "!TextGenerationGUI" because I wanted it to always stay at the top of the folder list, so I had to rename it. After removing the "!" from the folder name, the problem seems to be solved.

SuperFurias commented 1 year ago

I had the same issue: every file inside the folder showed the errors "Miniconda hook not found" and "The system cannot find the path specified".

After some time, I tried manually installing Miniconda inside my folder by running the installer that the .bat file automatically downloaded, and it turns out that Miniconda is unable to install into folders whose names contain "!". My folder was called "!TextGenerationGUI" because I wanted it to always stay at the top of the folder list, so I had to rename it. After removing the "!" from the folder name, the problem seems to be solved.

I think the installer should warn users that the text generation webui cannot be installed or run inside folders whose names contain special characters like "!", even though I doubt anyone besides me puts "!" in a folder name just to keep it at the top of the list.
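
If the installer ever adds such a warning, a pre-flight check along these lines would cover the "!" case (purely a hypothetical sketch, not code from the installer):

rem Hypothetical check near the top of the .bat file: refuse to run if the
rem current folder's path contains "!", which the Miniconda installer cannot handle
set "INSTALL_DIR=%cd%"
if not "%INSTALL_DIR%"=="%INSTALL_DIR:!=%" (
    echo The install path contains a "!" character, which breaks the Miniconda installer.
    echo Please rename or move the folder and run this script again.
    pause
    exit /b 1
)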

BigBlueGX commented 1 year ago

I have the same issue and my dir is C:\SD\oobabooga_windows

The automatic installer doesn't work. It gives a Miniconda hook not found error

jllllll commented 1 year ago

I had the same issue: every file inside the folder showed the errors "Miniconda hook not found" and "The system cannot find the path specified".

After some time, I tried manually installing Miniconda inside my folder by running the installer that the .bat file automatically downloaded, and it turns out that Miniconda is unable to install into folders whose names contain "!". My folder was called "!TextGenerationGUI" because I wanted it to always stay at the top of the folder list, so I had to rename it. After removing the "!" from the folder name, the problem seems to be solved.

I think the installer should warn users that the text generation webui cannot be installed or run inside folders whose names contain special characters like "!", even though I doubt anyone besides me puts "!" in a folder name just to keep it at the top of the list.

This is pretty good info to know.

jllllll commented 1 year ago

I have the same issue and my dir is C:\SD\oobabooga_windows

The automatic installer doesn't work. It gives a Miniconda hook not found error

What does the Miniconda installer do if you try to run it manually?

BigBlueGX commented 1 year ago

I have the same issue and my dir is C:\SD\oobabooga_windows The automatic installer doesn't work. It gives a Miniconda hook not found error

What does the Miniconda installer do if you try to run it manually?

I downloaded the automatic installer and ran it just now, and I get this error:

Conda environment is empty. Press any key to continue . . .

Here is the installation log

Downloading Miniconda from https://repo.anaconda.com/miniconda/Miniconda3-py310_23.1.0-1-Windows-x86_64.exe to C:\SD\oobabooga_windows\installer_files\miniconda_installer.exe

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 53.1M  100 53.1M    0     0  10.0M      0  0:00:05  0:00:05 --:--:-- 10.3M

Installing Miniconda to C:\SD\oobabooga_windows\installer_files\conda
Miniconda version: conda 22.11.1
Packages to install:
Collecting package metadata (current_repodata.json): done
Solving environment: done

Package Plan

environment location: C:\SD\oobabooga_windows\installer_files\env

added / updated specs:

The following packages will be downloaded:

package                    |            build
---------------------------|-----------------
ca-certificates-2023.05.30 |       haa95532_0         120 KB
libffi-3.4.4               |       hd77b12b_0         113 KB
openssl-3.0.8              |       h2bbff1b_0         7.4 MB
pip-23.1.2                 |  py310haa95532_0         2.8 MB
python-3.10.11             |       he1021f5_3        15.8 MB
setuptools-67.8.0          |  py310haa95532_0         1.1 MB
sqlite-3.41.2              |       h2bbff1b_0         894 KB
tzdata-2023c               |       h04d1e81_0         116 KB
wheel-0.38.4               |  py310haa95532_0          83 KB
xz-5.4.2                   |       h8cc25b3_0         592 KB
------------------------------------------------------------
                                       Total:        28.9 MB

The following NEW packages will be INSTALLED:

bzip2              pkgs/main/win-64::bzip2-1.0.8-he774522_0
ca-certificates    pkgs/main/win-64::ca-certificates-2023.05.30-haa95532_0
libffi             pkgs/main/win-64::libffi-3.4.4-hd77b12b_0
openssl            pkgs/main/win-64::openssl-3.0.8-h2bbff1b_0
pip                pkgs/main/win-64::pip-23.1.2-py310haa95532_0
python             pkgs/main/win-64::python-3.10.11-he1021f5_3
setuptools         pkgs/main/win-64::setuptools-67.8.0-py310haa95532_0
sqlite             pkgs/main/win-64::sqlite-3.41.2-h2bbff1b_0
tk                 pkgs/main/win-64::tk-8.6.12-h2bbff1b_0
tzdata             pkgs/main/noarch::tzdata-2023c-h04d1e81_0
vc                 pkgs/main/win-64::vc-14.2-h21ff451_1
vs2015_runtime     pkgs/main/win-64::vs2015_runtime-14.27.29016-h5e58377_2
wheel              pkgs/main/win-64::wheel-0.38.4-py310haa95532_0
xz                 pkgs/main/win-64::xz-5.4.2-h8cc25b3_0
zlib               pkgs/main/win-64::zlib-1.2.13-h8cc25b3_0

Downloading and Extracting Packages

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate C:\SD\oobabooga_windows\installer_files\env
#
# To deactivate an active environment, use
#
#     $ conda deactivate

What is your GPU

A) NVIDIA
B) AMD
C) Apple M Series
D) None (I want to run in CPU mode)

Input> A

Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Collecting package metadata (repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 23.1.0
  latest version: 23.5.0

Please update conda by running

    $ conda update -n base -c defaults conda

Or to minimize the number of packages updated during conda update use

    conda install conda=23.5.0

Package Plan

environment location: C:\SD\oobabooga_windows\installer_files\env

added / updated specs:

The following packages will be downloaded:

package                    |            build
---------------------------|-----------------
blas-1.0                   |              mkl           6 KB
certifi-2023.5.7           |  py310haa95532_0         153 KB
cryptography-39.0.1        |  py310h21b164f_2         1.0 MB
cuda-cccl-11.7.58          |                0         1.2 MB  nvidia/label/cuda-11.7.0
cuda-command-line-tools-11.7.0|                0           1 KB  nvidia/label/cuda-11.7.0
cuda-compiler-11.7.0       |                0           1 KB  nvidia/label/cuda-11.7.0
cuda-cudart-11.7.60        |                0         1.4 MB  nvidia/label/cuda-11.7.0
cuda-cudart-dev-11.7.60    |                0         695 KB  nvidia/label/cuda-11.7.0
cuda-cuobjdump-11.7.50     |                0         2.5 MB  nvidia/label/cuda-11.7.0
cuda-cupti-11.7.50         |                0        10.2 MB  nvidia/label/cuda-11.7.0
cuda-cuxxfilt-11.7.50      |                0         165 KB  nvidia/label/cuda-11.7.0
cuda-documentation-11.7.50 |                0          91 KB  nvidia/label/cuda-11.7.0
cuda-libraries-11.7.0      |                0           1 KB  nvidia/label/cuda-11.7.0
cuda-libraries-dev-11.7.0  |                0           1 KB  nvidia/label/cuda-11.7.0
cuda-memcheck-11.7.50      |                0         183 KB  nvidia/label/cuda-11.7.0
cuda-nsight-compute-11.7.0 |                0           1 KB  nvidia/label/cuda-11.7.0
cuda-nvcc-11.7.64          |                0        44.2 MB  nvidia/label/cuda-11.7.0
cuda-nvdisasm-11.7.50      |                0        31.6 MB  nvidia/label/cuda-11.7.0
cuda-nvml-dev-11.7.50      |                0          85 KB  nvidia/label/cuda-11.7.0
cuda-nvprof-11.7.50        |                0         1.5 MB  nvidia/label/cuda-11.7.0
cuda-nvprune-11.7.50       |                0         152 KB  nvidia/label/cuda-11.7.0
cuda-nvrtc-11.7.50         |                0        71.9 MB  nvidia/label/cuda-11.7.0
cuda-nvrtc-dev-11.7.50     |                0        14.3 MB  nvidia/label/cuda-11.7.0
cuda-nvtx-11.7.50          |                0          43 KB  nvidia/label/cuda-11.7.0
cuda-nvvp-11.7.50          |                0       113.6 MB  nvidia/label/cuda-11.7.0
cuda-runtime-11.7.0        |                0           1 KB  nvidia/label/cuda-11.7.0
cuda-sanitizer-api-11.7.50 |                0        12.6 MB  nvidia/label/cuda-11.7.0
cuda-toolkit-11.7.0        |                0           1 KB  nvidia/label/cuda-11.7.0
cuda-tools-11.7.0          |                0           1 KB  nvidia/label/cuda-11.7.0
cuda-visual-tools-11.7.0   |                0           1 KB  nvidia/label/cuda-11.7.0
filelock-3.9.0             |  py310haa95532_0          19 KB
giflib-5.2.1               |       h8cc25b3_3          88 KB
git-2.40.1                 |       haa95532_0        69.2 MB
intel-openmp-2023.1.0      |   h59b6b97_46319         2.7 MB
jinja2-3.1.2               |  py310haa95532_0         215 KB
jpeg-9e                    |       h2bbff1b_1         320 KB
lerc-3.0                   |       hd77b12b_0         120 KB
libcublas-11.10.1.25       |                0          24 KB  nvidia/label/cuda-11.7.0
libcublas-dev-11.10.1.25   |                0       286.3 MB  nvidia/label/cuda-11.7.0
libcufft-10.7.2.50         |                0           6 KB  nvidia/label/cuda-11.7.0
libcufft-dev-10.7.2.50     |                0       249.8 MB  nvidia/label/cuda-11.7.0
libcurand-10.2.10.50       |                0           3 KB  nvidia/label/cuda-11.7.0
libcurand-dev-10.2.10.50   |                0        49.1 MB  nvidia/label/cuda-11.7.0
libcusolver-11.3.5.50      |                0          29 KB  nvidia/label/cuda-11.7.0
libcusolver-dev-11.3.5.50  |                0        86.6 MB  nvidia/label/cuda-11.7.0
libcusparse-11.7.3.50      |                0          13 KB  nvidia/label/cuda-11.7.0
libcusparse-dev-11.7.3.50  |                0       147.6 MB  nvidia/label/cuda-11.7.0
libdeflate-1.17            |       h2bbff1b_0         151 KB
libnpp-11.7.3.21           |                0         286 KB  nvidia/label/cuda-11.7.0
libnpp-dev-11.7.3.21       |                0       114.3 MB  nvidia/label/cuda-11.7.0
libnvjpeg-11.7.2.34        |                0           4 KB  nvidia/label/cuda-11.7.0
libnvjpeg-dev-11.7.2.34    |                0         1.8 MB  nvidia/label/cuda-11.7.0
libpng-1.6.39              |       h8cc25b3_0         369 KB
libtiff-4.5.0              |       h6c2663c_2         1.2 MB
libuv-1.44.2               |       h2bbff1b_0         288 KB
libwebp-1.2.4              |       hbc33d0d_1          73 KB
libwebp-base-1.2.4         |       h2bbff1b_1         304 KB
markupsafe-2.1.1           |  py310h2bbff1b_0          26 KB
mkl-2023.1.0               |   h8bd8f75_46356       155.6 MB
mkl-service-2.4.0          |  py310h2bbff1b_1          44 KB
mkl_fft-1.3.6              |  py310h4ed8f06_1         157 KB
mkl_random-1.2.2           |  py310h4ed8f06_1         210 KB
mpmath-1.2.1               |  py310haa95532_0         779 KB
networkx-2.8.4             |  py310haa95532_1         2.6 MB
ninja-1.10.2               |       haa95532_5          14 KB
ninja-base-1.10.2          |       h6d14046_5         255 KB
nsight-compute-2022.2.0.13 |                0       336.2 MB  nvidia/label/cuda-11.7.0
numpy-1.24.3               |  py310h055cbcc_1          11 KB
numpy-base-1.24.3          |  py310h65a83cf_1         5.2 MB
pillow-9.4.0               |  py310hd77b12b_0        1007 KB
pyopenssl-23.0.0           |  py310haa95532_0          98 KB
pytorch-2.0.0              |py3.10_cuda11.7_cudnn8_0        1.17 GB  pytorch
pytorch-cuda-11.7          |       h16d0643_5           4 KB  pytorch
requests-2.29.0            |  py310haa95532_0          98 KB
sympy-1.11.1               |  py310haa95532_0        11.8 MB
tbb-2021.8.0               |       h59b6b97_0         149 KB
torchaudio-2.0.0           |      py310_cu117         5.7 MB  pytorch
torchvision-0.15.0         |      py310_cu117         7.7 MB  pytorch
typing_extensions-4.6.3    |  py310haa95532_0          56 KB
urllib3-1.26.16            |  py310haa95532_0         202 KB
zstd-1.5.5                 |       hd43e919_0         682 KB
------------------------------------------------------------
                                       Total:        2.97 GB

The following NEW packages will be INSTALLED:

blas pkgs/main/win-64::blas-1.0-mkl brotlipy pkgs/main/win-64::brotlipy-0.7.0-py310h2bbff1b_1002 certifi pkgs/main/win-64::certifi-2023.5.7-py310haa95532_0 cffi pkgs/main/win-64::cffi-1.15.1-py310h2bbff1b_3 charset-normalizer pkgs/main/noarch::charset-normalizer-2.0.4-pyhd3eb1b0_0 cryptography pkgs/main/win-64::cryptography-39.0.1-py310h21b164f_2 cuda-cccl nvidia/label/cuda-11.7.0/win-64::cuda-cccl-11.7.58-0 cuda-command-line~ nvidia/label/cuda-11.7.0/win-64::cuda-command-line-tools-11.7.0-0 cuda-compiler nvidia/label/cuda-11.7.0/win-64::cuda-compiler-11.7.0-0 cuda-cudart nvidia/label/cuda-11.7.0/win-64::cuda-cudart-11.7.60-0 cuda-cudart-dev nvidia/label/cuda-11.7.0/win-64::cuda-cudart-dev-11.7.60-0 cuda-cuobjdump nvidia/label/cuda-11.7.0/win-64::cuda-cuobjdump-11.7.50-0 cuda-cupti nvidia/label/cuda-11.7.0/win-64::cuda-cupti-11.7.50-0 cuda-cuxxfilt nvidia/label/cuda-11.7.0/win-64::cuda-cuxxfilt-11.7.50-0 cuda-documentation nvidia/label/cuda-11.7.0/win-64::cuda-documentation-11.7.50-0 cuda-libraries nvidia/label/cuda-11.7.0/win-64::cuda-libraries-11.7.0-0 cuda-libraries-dev nvidia/label/cuda-11.7.0/win-64::cuda-libraries-dev-11.7.0-0 cuda-memcheck nvidia/label/cuda-11.7.0/win-64::cuda-memcheck-11.7.50-0 cuda-nsight-compu~ nvidia/label/cuda-11.7.0/win-64::cuda-nsight-compute-11.7.0-0 cuda-nvcc nvidia/label/cuda-11.7.0/win-64::cuda-nvcc-11.7.64-0 cuda-nvdisasm nvidia/label/cuda-11.7.0/win-64::cuda-nvdisasm-11.7.50-0 cuda-nvml-dev nvidia/label/cuda-11.7.0/win-64::cuda-nvml-dev-11.7.50-0 cuda-nvprof nvidia/label/cuda-11.7.0/win-64::cuda-nvprof-11.7.50-0 cuda-nvprune nvidia/label/cuda-11.7.0/win-64::cuda-nvprune-11.7.50-0 cuda-nvrtc nvidia/label/cuda-11.7.0/win-64::cuda-nvrtc-11.7.50-0 cuda-nvrtc-dev nvidia/label/cuda-11.7.0/win-64::cuda-nvrtc-dev-11.7.50-0 cuda-nvtx nvidia/label/cuda-11.7.0/win-64::cuda-nvtx-11.7.50-0 cuda-nvvp nvidia/label/cuda-11.7.0/win-64::cuda-nvvp-11.7.50-0 cuda-runtime nvidia/label/cuda-11.7.0/win-64::cuda-runtime-11.7.0-0 cuda-sanitizer-api nvidia/label/cuda-11.7.0/win-64::cuda-sanitizer-api-11.7.50-0 cuda-toolkit nvidia/label/cuda-11.7.0/win-64::cuda-toolkit-11.7.0-0 cuda-tools nvidia/label/cuda-11.7.0/win-64::cuda-tools-11.7.0-0 cuda-visual-tools nvidia/label/cuda-11.7.0/win-64::cuda-visual-tools-11.7.0-0 filelock pkgs/main/win-64::filelock-3.9.0-py310haa95532_0 freetype pkgs/main/win-64::freetype-2.12.1-ha860e81_0 giflib pkgs/main/win-64::giflib-5.2.1-h8cc25b3_3 git pkgs/main/win-64::git-2.40.1-haa95532_0 idna pkgs/main/win-64::idna-3.4-py310haa95532_0 intel-openmp pkgs/main/win-64::intel-openmp-2023.1.0-h59b6b97_46319 jinja2 pkgs/main/win-64::jinja2-3.1.2-py310haa95532_0 jpeg pkgs/main/win-64::jpeg-9e-h2bbff1b_1 lerc pkgs/main/win-64::lerc-3.0-hd77b12b_0 libcublas nvidia/label/cuda-11.7.0/win-64::libcublas-11.10.1.25-0 libcublas-dev nvidia/label/cuda-11.7.0/win-64::libcublas-dev-11.10.1.25-0 libcufft nvidia/label/cuda-11.7.0/win-64::libcufft-10.7.2.50-0 libcufft-dev nvidia/label/cuda-11.7.0/win-64::libcufft-dev-10.7.2.50-0 libcurand nvidia/label/cuda-11.7.0/win-64::libcurand-10.2.10.50-0 libcurand-dev nvidia/label/cuda-11.7.0/win-64::libcurand-dev-10.2.10.50-0 libcusolver nvidia/label/cuda-11.7.0/win-64::libcusolver-11.3.5.50-0 libcusolver-dev nvidia/label/cuda-11.7.0/win-64::libcusolver-dev-11.3.5.50-0 libcusparse nvidia/label/cuda-11.7.0/win-64::libcusparse-11.7.3.50-0 libcusparse-dev nvidia/label/cuda-11.7.0/win-64::libcusparse-dev-11.7.3.50-0 libdeflate pkgs/main/win-64::libdeflate-1.17-h2bbff1b_0 libnpp nvidia/label/cuda-11.7.0/win-64::libnpp-11.7.3.21-0 
libnpp-dev nvidia/label/cuda-11.7.0/win-64::libnpp-dev-11.7.3.21-0 libnvjpeg nvidia/label/cuda-11.7.0/win-64::libnvjpeg-11.7.2.34-0 libnvjpeg-dev nvidia/label/cuda-11.7.0/win-64::libnvjpeg-dev-11.7.2.34-0 libpng pkgs/main/win-64::libpng-1.6.39-h8cc25b3_0 libtiff pkgs/main/win-64::libtiff-4.5.0-h6c2663c_2 libuv pkgs/main/win-64::libuv-1.44.2-h2bbff1b_0 libwebp pkgs/main/win-64::libwebp-1.2.4-hbc33d0d_1 libwebp-base pkgs/main/win-64::libwebp-base-1.2.4-h2bbff1b_1 lz4-c pkgs/main/win-64::lz4-c-1.9.4-h2bbff1b_0 markupsafe pkgs/main/win-64::markupsafe-2.1.1-py310h2bbff1b_0 mkl pkgs/main/win-64::mkl-2023.1.0-h8bd8f75_46356 mkl-service pkgs/main/win-64::mkl-service-2.4.0-py310h2bbff1b_1 mkl_fft pkgs/main/win-64::mkl_fft-1.3.6-py310h4ed8f06_1 mkl_random pkgs/main/win-64::mkl_random-1.2.2-py310h4ed8f06_1 mpmath pkgs/main/win-64::mpmath-1.2.1-py310haa95532_0 networkx pkgs/main/win-64::networkx-2.8.4-py310haa95532_1 ninja pkgs/main/win-64::ninja-1.10.2-haa95532_5 ninja-base pkgs/main/win-64::ninja-base-1.10.2-h6d14046_5 nsight-compute nvidia/label/cuda-11.7.0/win-64::nsight-compute-2022.2.0.13-0 numpy pkgs/main/win-64::numpy-1.24.3-py310h055cbcc_1 numpy-base pkgs/main/win-64::numpy-base-1.24.3-py310h65a83cf_1 pillow pkgs/main/win-64::pillow-9.4.0-py310hd77b12b_0 pycparser pkgs/main/noarch::pycparser-2.21-pyhd3eb1b0_0 pyopenssl pkgs/main/win-64::pyopenssl-23.0.0-py310haa95532_0 pysocks pkgs/main/win-64::pysocks-1.7.1-py310haa95532_0 pytorch pytorch/win-64::pytorch-2.0.0-py3.10_cuda11.7_cudnn8_0 pytorch-cuda pytorch/win-64::pytorch-cuda-11.7-h16d0643_5 pytorch-mutex pytorch/noarch::pytorch-mutex-1.0-cuda requests pkgs/main/win-64::requests-2.29.0-py310haa95532_0 sympy pkgs/main/win-64::sympy-1.11.1-py310haa95532_0 tbb pkgs/main/win-64::tbb-2021.8.0-h59b6b97_0 torchaudio pytorch/win-64::torchaudio-2.0.0-py310_cu117 torchvision pytorch/win-64::torchvision-0.15.0-py310_cu117 typing_extensions pkgs/main/win-64::typing_extensions-4.6.3-py310haa95532_0 urllib3 pkgs/main/win-64::urllib3-1.26.16-py310haa95532_0 win_inet_pton pkgs/main/win-64::win_inet_pton-1.1.0-py310haa95532_0 zstd pkgs/main/win-64::zstd-1.5.5-hd43e919_0

Downloading and Extracting Packages

Preparing transaction: done Verifying transaction: done Executing transaction: done Cloning into 'text-generation-webui'... remote: Enumerating objects: 8385, done. remote: Counting objects: 100% (601/601), done. remote: Compressing objects: 100% (252/252), done. remote: Total 8385 (delta 380), reused 527 (delta 349), pack-reused 7784 Receiving objects: 100% (8385/8385), 2.90 MiB | 3.69 MiB/s, done. Resolving deltas: 100% (5585/5585), done. Already up to date. Collecting git+https://github.com/huggingface/peft@e45529b149c7f91ec1d4d82a5a152ef56c56cb94 (from -r requirements.txt (line 19)) Cloning https://github.com/huggingface/peft (to revision e45529b149c7f91ec1d4d82a5a152ef56c56cb94) to c:\sd\oobabooga_windows\installer_files\pip-req-build-9vi93ktr Running command git clone --filter=blob:none --quiet https://github.com/huggingface/peft 'C:\SD\oobabooga_windows\installer_files\pip-req-build-9vi93ktr' Running command git rev-parse -q --verify 'sha^e45529b149c7f91ec1d4d82a5a152ef56c56cb94' Running command git fetch -q https://github.com/huggingface/peft e45529b149c7f91ec1d4d82a5a152ef56c56cb94 Running command git checkout -q e45529b149c7f91ec1d4d82a5a152ef56c56cb94 Resolved https://github.com/huggingface/peft to commit e45529b149c7f91ec1d4d82a5a152ef56c56cb94 Installing build dependencies ... done Getting requirements to build wheel ... done Preparing metadata (pyproject.toml) ... done Ignoring bitsandbytes: markers 'platform_system != "Windows"' don't match your environment Collecting bitsandbytes==0.39.0 (from -r requirements.txt (line 21)) Using cached https://github.com/jllllll/bitsandbytes-windows-webui/raw/main/bitsandbytes-0.39.0-py3-none-any.whl (85.3 MB) Ignoring llama-cpp-python: markers 'platform_system != "Windows"' don't match your environment Collecting llama-cpp-python==0.1.62 (from -r requirements.txt (line 23)) Downloading https://github.com/abetlen/llama-cpp-python/releases/download/v0.1.62/llama_cpp_python-0.1.62-cp310-cp310-win_amd64.whl (459 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 459.5/459.5 kB 1.8 MB/s eta 0:00:00 Collecting auto-gptq==0.2.2+cu117 (from -r requirements.txt (line 24)) Downloading https://github.com/PanQiWei/AutoGPTQ/releases/download/v0.2.2/auto_gptq-0.2.2+cu117-cp310-cp310-win_amd64.whl (596 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 596.5/596.5 kB 12.7 MB/s eta 0:00:00 Ignoring auto-gptq: markers 'platform_system == "Linux"' don't match your environment Collecting accelerate==0.20.3 (from -r requirements.txt (line 1)) Using cached accelerate-0.20.3-py3-none-any.whl (227 kB) Collecting colorama (from -r requirements.txt (line 2)) Using cached colorama-0.4.6-py2.py3-none-any.whl (25 kB) Collecting datasets (from -r requirements.txt (line 3)) Using cached datasets-2.12.0-py3-none-any.whl (474 kB) Collecting einops (from -r requirements.txt (line 4)) Using cached einops-0.6.1-py3-none-any.whl (42 kB) Collecting flexgen==0.1.7 (from -r requirements.txt (line 5)) Using cached flexgen-0.1.7-py3-none-any.whl (50 kB) Collecting gradio_client==0.2.5 (from -r requirements.txt (line 6)) Using cached gradio_client-0.2.5-py3-none-any.whl (288 kB) Collecting gradio==3.33.1 (from -r requirements.txt (line 7)) Using cached gradio-3.33.1-py3-none-any.whl (20.0 MB) Collecting markdown (from -r requirements.txt (line 8)) Using cached Markdown-3.4.3-py3-none-any.whl (93 kB) Requirement already satisfied: numpy in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from -r requirements.txt (line 9)) (1.24.3) Collecting pandas (from -r requirements.txt 
(line 10)) Using cached pandas-2.0.2-cp310-cp310-win_amd64.whl (10.7 MB) Collecting Pillow>=9.5.0 (from -r requirements.txt (line 11)) Using cached Pillow-9.5.0-cp310-cp310-win_amd64.whl (2.5 MB) Collecting pyyaml (from -r requirements.txt (line 12)) Using cached PyYAML-6.0-cp310-cp310-win_amd64.whl (151 kB) Requirement already satisfied: requests in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from -r requirements.txt (line 13)) (2.29.0) Collecting requests (from -r requirements.txt (line 13)) Using cached requests-2.31.0-py3-none-any.whl (62 kB) Collecting safetensors==0.3.1 (from -r requirements.txt (line 14)) Using cached safetensors-0.3.1-cp310-cp310-win_amd64.whl (263 kB) Collecting sentencepiece (from -r requirements.txt (line 15)) Using cached sentencepiece-0.1.99-cp310-cp310-win_amd64.whl (977 kB) Collecting tqdm (from -r requirements.txt (line 16)) Using cached tqdm-4.65.0-py3-none-any.whl (77 kB) Collecting scipy (from -r requirements.txt (line 17)) Using cached scipy-1.10.1-cp310-cp310-win_amd64.whl (42.5 MB) Collecting transformers==4.30.0 (from -r requirements.txt (line 18)) Using cached transformers-4.30.0-py3-none-any.whl (7.2 MB) Collecting packaging>=20.0 (from accelerate==0.20.3->-r requirements.txt (line 1)) Using cached packaging-23.1-py3-none-any.whl (48 kB) Collecting psutil (from accelerate==0.20.3->-r requirements.txt (line 1)) Using cached psutil-5.9.5-cp36-abi3-win_amd64.whl (255 kB) Requirement already satisfied: torch>=1.6.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from accelerate==0.20.3->-r requirements.txt (line 1)) (2.0.0) Collecting pulp (from flexgen==0.1.7->-r requirements.txt (line 5)) Using cached PuLP-2.7.0-py3-none-any.whl (14.3 MB) Collecting attrs (from flexgen==0.1.7->-r requirements.txt (line 5)) Using cached attrs-23.1.0-py3-none-any.whl (61 kB) Collecting fsspec (from gradio_client==0.2.5->-r requirements.txt (line 6)) Using cached fsspec-2023.6.0-py3-none-any.whl (163 kB) Collecting httpx (from gradio_client==0.2.5->-r requirements.txt (line 6)) Using cached httpx-0.24.1-py3-none-any.whl (75 kB) Collecting huggingface-hub>=0.13.0 (from gradio_client==0.2.5->-r requirements.txt (line 6)) Using cached huggingface_hub-0.15.1-py3-none-any.whl (236 kB) Requirement already satisfied: typing-extensions in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from gradio_client==0.2.5->-r requirements.txt (line 6)) (4.6.3) Collecting websockets (from gradio_client==0.2.5->-r requirements.txt (line 6)) Using cached websockets-11.0.3-cp310-cp310-win_amd64.whl (124 kB) Collecting aiofiles (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached aiofiles-23.1.0-py3-none-any.whl (14 kB) Collecting aiohttp (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached aiohttp-3.8.4-cp310-cp310-win_amd64.whl (319 kB) Collecting altair>=4.2.0 (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached altair-5.0.1-py3-none-any.whl (471 kB) Collecting fastapi (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached fastapi-0.97.0-py3-none-any.whl (56 kB) Collecting ffmpy (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached ffmpy-0.3.0-py3-none-any.whl Requirement already satisfied: jinja2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from gradio==3.33.1->-r requirements.txt (line 7)) (3.1.2) Collecting markdown-it-py[linkify]>=2.0.0 (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached markdown_it_py-3.0.0-py3-none-any.whl (87 kB) Requirement 
already satisfied: markupsafe in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from gradio==3.33.1->-r requirements.txt (line 7)) (2.1.1) Collecting matplotlib (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached matplotlib-3.7.1-cp310-cp310-win_amd64.whl (7.6 MB) Collecting mdit-py-plugins<=0.3.3 (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached mdit_py_plugins-0.3.3-py3-none-any.whl (50 kB) Collecting orjson (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached orjson-3.9.1-cp310-none-win_amd64.whl (191 kB) Collecting pydantic (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached pydantic-1.10.9-cp310-cp310-win_amd64.whl (2.1 MB) Collecting pydub (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached pydub-0.25.1-py2.py3-none-any.whl (32 kB) Collecting pygments>=2.12.0 (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached Pygments-2.15.1-py3-none-any.whl (1.1 MB) Collecting python-multipart (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached python_multipart-0.0.6-py3-none-any.whl (45 kB) Collecting semantic-version (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached semantic_version-2.10.0-py2.py3-none-any.whl (15 kB) Collecting uvicorn>=0.14.0 (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached uvicorn-0.22.0-py3-none-any.whl (58 kB) Requirement already satisfied: filelock in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from transformers==4.30.0->-r requirements.txt (line 18)) (3.9.0) Collecting regex!=2019.12.17 (from transformers==4.30.0->-r requirements.txt (line 18)) Using cached regex-2023.6.3-cp310-cp310-win_amd64.whl (268 kB) Collecting tokenizers!=0.11.3,<0.14,>=0.11.1 (from transformers==4.30.0->-r requirements.txt (line 18)) Using cached tokenizers-0.13.3-cp310-cp310-win_amd64.whl (3.5 MB) Collecting pyarrow>=8.0.0 (from datasets->-r requirements.txt (line 3)) Using cached pyarrow-12.0.1-cp310-cp310-win_amd64.whl (21.5 MB) Collecting dill<0.3.7,>=0.3.0 (from datasets->-r requirements.txt (line 3)) Using cached dill-0.3.6-py3-none-any.whl (110 kB) Collecting xxhash (from datasets->-r requirements.txt (line 3)) Using cached xxhash-3.2.0-cp310-cp310-win_amd64.whl (30 kB) Collecting multiprocess (from datasets->-r requirements.txt (line 3)) Using cached multiprocess-0.70.14-py310-none-any.whl (134 kB) Collecting responses<0.19 (from datasets->-r requirements.txt (line 3)) Using cached responses-0.18.0-py3-none-any.whl (38 kB) Collecting python-dateutil>=2.8.2 (from pandas->-r requirements.txt (line 10)) Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB) Collecting pytz>=2020.1 (from pandas->-r requirements.txt (line 10)) Using cached pytz-2023.3-py2.py3-none-any.whl (502 kB) Collecting tzdata>=2022.1 (from pandas->-r requirements.txt (line 10)) Using cached tzdata-2023.3-py2.py3-none-any.whl (341 kB) Requirement already satisfied: charset-normalizer<4,>=2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->-r requirements.txt (line 13)) (2.0.4) Requirement already satisfied: idna<4,>=2.5 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->-r requirements.txt (line 13)) (3.4) Requirement already satisfied: urllib3<3,>=1.21.1 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->-r requirements.txt (line 13)) (1.26.16) Requirement already satisfied: certifi>=2017.4.17 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from 
requests->-r requirements.txt (line 13)) (2023.5.7) Collecting diskcache>=5.6.1 (from llama-cpp-python==0.1.62->-r requirements.txt (line 23)) Using cached diskcache-5.6.1-py3-none-any.whl (45 kB) Collecting rouge (from auto-gptq==0.2.2+cu117->-r requirements.txt (line 24)) Using cached rouge-1.0.1-py3-none-any.whl (13 kB) Collecting jsonschema>=3.0 (from altair>=4.2.0->gradio==3.33.1->-r requirements.txt (line 7)) Using cached jsonschema-4.17.3-py3-none-any.whl (90 kB) Collecting toolz (from altair>=4.2.0->gradio==3.33.1->-r requirements.txt (line 7)) Using cached toolz-0.12.0-py3-none-any.whl (55 kB) Collecting multidict<7.0,>=4.5 (from aiohttp->gradio==3.33.1->-r requirements.txt (line 7)) Using cached multidict-6.0.4-cp310-cp310-win_amd64.whl (28 kB) Collecting async-timeout<5.0,>=4.0.0a3 (from aiohttp->gradio==3.33.1->-r requirements.txt (line 7)) Using cached async_timeout-4.0.2-py3-none-any.whl (5.8 kB) Collecting yarl<2.0,>=1.0 (from aiohttp->gradio==3.33.1->-r requirements.txt (line 7)) Using cached yarl-1.9.2-cp310-cp310-win_amd64.whl (61 kB) Collecting frozenlist>=1.1.1 (from aiohttp->gradio==3.33.1->-r requirements.txt (line 7)) Using cached frozenlist-1.3.3-cp310-cp310-win_amd64.whl (33 kB) Collecting aiosignal>=1.1.2 (from aiohttp->gradio==3.33.1->-r requirements.txt (line 7)) Using cached aiosignal-1.3.1-py3-none-any.whl (7.6 kB) Collecting mdurl~=0.1 (from markdown-it-py[linkify]>=2.0.0->gradio==3.33.1->-r requirements.txt (line 7)) Using cached mdurl-0.1.2-py3-none-any.whl (10.0 kB) Collecting linkify-it-py<3,>=1 (from markdown-it-py[linkify]>=2.0.0->gradio==3.33.1->-r requirements.txt (line 7)) Using cached linkify_it_py-2.0.2-py3-none-any.whl (19 kB) INFO: pip is looking at multiple versions of mdit-py-plugins to determine which version is compatible with other requirements. This could take a while. Collecting mdit-py-plugins<=0.3.3 (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached mdit_py_plugins-0.3.2-py3-none-any.whl (50 kB) Using cached mdit_py_plugins-0.3.1-py3-none-any.whl (46 kB) Using cached mdit_py_plugins-0.3.0-py3-none-any.whl (43 kB) Using cached mdit_py_plugins-0.2.8-py3-none-any.whl (41 kB) Using cached mdit_py_plugins-0.2.7-py3-none-any.whl (41 kB) Using cached mdit_py_plugins-0.2.6-py3-none-any.whl (39 kB) Using cached mdit_py_plugins-0.2.5-py3-none-any.whl (39 kB) INFO: pip is looking at multiple versions of mdit-py-plugins to determine which version is compatible with other requirements. This could take a while. Using cached mdit_py_plugins-0.2.4-py3-none-any.whl (39 kB) Using cached mdit_py_plugins-0.2.3-py3-none-any.whl (39 kB) Using cached mdit_py_plugins-0.2.2-py3-none-any.whl (39 kB) Using cached mdit_py_plugins-0.2.1-py3-none-any.whl (38 kB) Using cached mdit_py_plugins-0.2.0-py3-none-any.whl (38 kB) INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C. 
Using cached mdit_py_plugins-0.1.0-py3-none-any.whl (37 kB) Collecting markdown-it-py[linkify]>=2.0.0 (from gradio==3.33.1->-r requirements.txt (line 7)) Using cached markdown_it_py-2.2.0-py3-none-any.whl (84 kB) Collecting six>=1.5 (from python-dateutil>=2.8.2->pandas->-r requirements.txt (line 10)) Using cached six-1.16.0-py2.py3-none-any.whl (11 kB) Requirement already satisfied: sympy in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from torch>=1.6.0->accelerate==0.20.3->-r requirements.txt (line 1)) (1.11.1) Requirement already satisfied: networkx in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from torch>=1.6.0->accelerate==0.20.3->-r requirements.txt (line 1)) (2.8.4) Collecting click>=7.0 (from uvicorn>=0.14.0->gradio==3.33.1->-r requirements.txt (line 7)) Using cached click-8.1.3-py3-none-any.whl (96 kB) Collecting h11>=0.8 (from uvicorn>=0.14.0->gradio==3.33.1->-r requirements.txt (line 7)) Using cached h11-0.14.0-py3-none-any.whl (58 kB) Collecting starlette<0.28.0,>=0.27.0 (from fastapi->gradio==3.33.1->-r requirements.txt (line 7)) Using cached starlette-0.27.0-py3-none-any.whl (66 kB) Collecting httpcore<0.18.0,>=0.15.0 (from httpx->gradio_client==0.2.5->-r requirements.txt (line 6)) Using cached httpcore-0.17.2-py3-none-any.whl (72 kB) Collecting sniffio (from httpx->gradio_client==0.2.5->-r requirements.txt (line 6)) Using cached sniffio-1.3.0-py3-none-any.whl (10 kB) Collecting contourpy>=1.0.1 (from matplotlib->gradio==3.33.1->-r requirements.txt (line 7)) Using cached contourpy-1.1.0-cp310-cp310-win_amd64.whl (470 kB) Collecting cycler>=0.10 (from matplotlib->gradio==3.33.1->-r requirements.txt (line 7)) Using cached cycler-0.11.0-py3-none-any.whl (6.4 kB) Collecting fonttools>=4.22.0 (from matplotlib->gradio==3.33.1->-r requirements.txt (line 7)) Using cached fonttools-4.40.0-cp310-cp310-win_amd64.whl (1.9 MB) Collecting kiwisolver>=1.0.1 (from matplotlib->gradio==3.33.1->-r requirements.txt (line 7)) Using cached kiwisolver-1.4.4-cp310-cp310-win_amd64.whl (55 kB) Collecting pyparsing>=2.3.1 (from matplotlib->gradio==3.33.1->-r requirements.txt (line 7)) Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB) Collecting anyio<5.0,>=3.0 (from httpcore<0.18.0,>=0.15.0->httpx->gradio_client==0.2.5->-r requirements.txt (line 6)) Using cached anyio-3.7.0-py3-none-any.whl (80 kB) Collecting pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0 (from jsonschema>=3.0->altair>=4.2.0->gradio==3.33.1->-r requirements.txt (line 7)) Using cached pyrsistent-0.19.3-cp310-cp310-win_amd64.whl (62 kB) Collecting uc-micro-py (from linkify-it-py<3,>=1->markdown-it-py[linkify]>=2.0.0->gradio==3.33.1->-r requirements.txt (line 7)) Using cached uc_micro_py-1.0.2-py3-none-any.whl (6.2 kB) Requirement already satisfied: mpmath>=0.19 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sympy->torch>=1.6.0->accelerate==0.20.3->-r requirements.txt (line 1)) (1.2.1) Collecting exceptiongroup (from anyio<5.0,>=3.0->httpcore<0.18.0,>=0.15.0->httpx->gradio_client==0.2.5->-r requirements.txt (line 6)) Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB) Building wheels for collected packages: peft Building wheel for peft (pyproject.toml) ... 
done Created wheel for peft: filename=peft-0.4.0.dev0-py3-none-any.whl size=58605 sha256=ba7701070a3efcde16b45297a23a1dcbfa941ae84f79c6fda4ccff66b88428be Stored in directory: c:\users\admin\appdata\local\pip\cache\wheels\75\5f\15\f22ec8074767da800560c87bf2e22117ba2bcacf96f442e920 Successfully built peft Installing collected packages: tokenizers, sentencepiece, safetensors, pytz, pydub, pulp, ffmpy, xxhash, websockets, uc-micro-py, tzdata, toolz, sniffio, six, semantic-version, scipy, requests, regex, pyyaml, python-multipart, pyrsistent, pyparsing, pygments, pydantic, pyarrow, psutil, Pillow, packaging, orjson, multidict, mdurl, markdown, kiwisolver, h11, fsspec, frozenlist, fonttools, exceptiongroup, einops, diskcache, dill, cycler, contourpy, colorama, attrs, async-timeout, aiofiles, yarl, tqdm, rouge, responses, python-dateutil, multiprocess, markdown-it-py, llama-cpp-python, linkify-it-py, jsonschema, click, bitsandbytes, anyio, aiosignal, uvicorn, starlette, pandas, mdit-py-plugins, matplotlib, huggingface-hub, httpcore, aiohttp, accelerate, transformers, httpx, fastapi, altair, peft, gradio_client, flexgen, datasets, gradio, auto-gptq Attempting uninstall: requests Found existing installation: requests 2.29.0 Uninstalling requests-2.29.0: Successfully uninstalled requests-2.29.0 Attempting uninstall: Pillow Found existing installation: Pillow 9.4.0 Uninstalling Pillow-9.4.0: Successfully uninstalled Pillow-9.4.0 Successfully installed Pillow-9.5.0 accelerate-0.20.3 aiofiles-23.1.0 aiohttp-3.8.4 aiosignal-1.3.1 altair-5.0.1 anyio-3.7.0 async-timeout-4.0.2 attrs-23.1.0 auto-gptq-0.2.2+cu117 bitsandbytes-0.39.0 click-8.1.3 colorama-0.4.6 contourpy-1.1.0 cycler-0.11.0 datasets-2.12.0 dill-0.3.6 diskcache-5.6.1 einops-0.6.1 exceptiongroup-1.1.1 fastapi-0.97.0 ffmpy-0.3.0 flexgen-0.1.7 fonttools-4.40.0 frozenlist-1.3.3 fsspec-2023.6.0 gradio-3.33.1 gradio_client-0.2.5 h11-0.14.0 httpcore-0.17.2 httpx-0.24.1 huggingface-hub-0.15.1 jsonschema-4.17.3 kiwisolver-1.4.4 linkify-it-py-2.0.2 llama-cpp-python-0.1.62 markdown-3.4.3 markdown-it-py-2.2.0 matplotlib-3.7.1 mdit-py-plugins-0.3.3 mdurl-0.1.2 multidict-6.0.4 multiprocess-0.70.14 orjson-3.9.1 packaging-23.1 pandas-2.0.2 peft-0.4.0.dev0 psutil-5.9.5 pulp-2.7.0 pyarrow-12.0.1 pydantic-1.10.9 pydub-0.25.1 pygments-2.15.1 pyparsing-3.0.9 pyrsistent-0.19.3 python-dateutil-2.8.2 python-multipart-0.0.6 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 responses-0.18.0 rouge-1.0.1 safetensors-0.3.1 scipy-1.10.1 semantic-version-2.10.0 sentencepiece-0.1.99 six-1.16.0 sniffio-1.3.0 starlette-0.27.0 tokenizers-0.13.3 toolz-0.12.0 tqdm-4.65.0 transformers-4.30.0 tzdata-2023.3 uc-micro-py-1.0.2 uvicorn-0.22.0 websockets-11.0.3 xxhash-3.2.0 yarl-1.9.2 Collecting flask_cloudflared==0.0.12 (from -r extensions\api\requirements.txt (line 1)) Using cached flask_cloudflared-0.0.12-py3-none-any.whl (6.3 kB) Collecting websockets==11.0.2 (from -r extensions\api\requirements.txt (line 2)) Using cached websockets-11.0.2-cp310-cp310-win_amd64.whl (124 kB) Collecting Flask>=0.8 (from flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) Using cached Flask-2.3.2-py3-none-any.whl (96 kB) Requirement already satisfied: requests in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) (2.31.0) Collecting Werkzeug>=2.3.3 (from Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) Using cached Werkzeug-2.3.6-py3-none-any.whl (242 kB) 
Requirement already satisfied: Jinja2>=3.1.2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) (3.1.2) Collecting itsdangerous>=2.1.2 (from Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) Using cached itsdangerous-2.1.2-py3-none-any.whl (15 kB) Requirement already satisfied: click>=8.1.3 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) (8.1.3) Collecting blinker>=1.6.2 (from Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) Using cached blinker-1.6.2-py3-none-any.whl (13 kB) Requirement already satisfied: charset-normalizer<4,>=2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) (2.0.4) Requirement already satisfied: idna<4,>=2.5 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) (3.4) Requirement already satisfied: urllib3<3,>=1.21.1 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) (1.26.16) Requirement already satisfied: certifi>=2017.4.17 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) (2023.5.7) Requirement already satisfied: colorama in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from click>=8.1.3->Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) (0.4.6) Requirement already satisfied: MarkupSafe>=2.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from Jinja2>=3.1.2->Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\api\requirements.txt (line 1)) (2.1.1) Installing collected packages: Werkzeug, websockets, itsdangerous, blinker, Flask, flask_cloudflared Attempting uninstall: websockets Found existing installation: websockets 11.0.3 Uninstalling websockets-11.0.3: Successfully uninstalled websockets-11.0.3 Successfully installed Flask-2.3.2 Werkzeug-2.3.6 blinker-1.6.2 flask_cloudflared-0.0.12 itsdangerous-2.1.2 websockets-11.0.2 Collecting elevenlabs==0.2. 
(from -r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached elevenlabs-0.2.18-py3-none-any.whl (14 kB) Requirement already satisfied: pydantic>=1.10 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) (1.10.9) Collecting ipython>=7.0 (from elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached ipython-8.14.0-py3-none-any.whl (798 kB) Requirement already satisfied: requests>=2.20 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) (2.31.0) Collecting backcall (from ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached backcall-0.2.0-py2.py3-none-any.whl (11 kB) Collecting decorator (from ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached decorator-5.1.1-py3-none-any.whl (9.1 kB) Collecting jedi>=0.16 (from ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached jedi-0.18.2-py2.py3-none-any.whl (1.6 MB) Collecting matplotlib-inline (from ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached matplotlib_inline-0.1.6-py3-none-any.whl (9.4 kB) Collecting pickleshare (from ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached pickleshare-0.7.5-py2.py3-none-any.whl (6.9 kB) Collecting prompt-toolkit!=3.0.37,<3.1.0,>=3.0.30 (from ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached prompt_toolkit-3.0.38-py3-none-any.whl (385 kB) Requirement already satisfied: pygments>=2.4.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) (2.15.1) Collecting stack-data (from ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached stack_data-0.6.2-py3-none-any.whl (24 kB) Collecting traitlets>=5 (from ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached traitlets-5.9.0-py3-none-any.whl (117 kB) Requirement already satisfied: colorama in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) (0.4.6) Requirement already satisfied: typing-extensions>=4.2.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from pydantic>=1.10->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) (4.6.3) Requirement already satisfied: charset-normalizer<4,>=2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests>=2.20->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) (2.0.4) Requirement already satisfied: idna<4,>=2.5 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests>=2.20->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) (3.4) Requirement already satisfied: urllib3<3,>=1.21.1 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests>=2.20->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) (1.26.16) Requirement already satisfied: certifi>=2017.4.17 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from 
requests>=2.20->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) (2023.5.7) Collecting parso<0.9.0,>=0.8.0 (from jedi>=0.16->ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached parso-0.8.3-py2.py3-none-any.whl (100 kB) Collecting wcwidth (from prompt-toolkit!=3.0.37,<3.1.0,>=3.0.30->ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached wcwidth-0.2.6-py2.py3-none-any.whl (29 kB) Collecting executing>=1.2.0 (from stack-data->ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached executing-1.2.0-py2.py3-none-any.whl (24 kB) Collecting asttokens>=2.1.0 (from stack-data->ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached asttokens-2.2.1-py2.py3-none-any.whl (26 kB) Collecting pure-eval (from stack-data->ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) Using cached pure_eval-0.2.2-py3-none-any.whl (11 kB) Requirement already satisfied: six in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from asttokens>=2.1.0->stack-data->ipython>=7.0->elevenlabs==0.2.->-r extensions\elevenlabs_tts\requirements.txt (line 1)) (1.16.0) Installing collected packages: wcwidth, pure-eval, pickleshare, executing, backcall, traitlets, prompt-toolkit, parso, decorator, asttokens, stack-data, matplotlib-inline, jedi, ipython, elevenlabs Successfully installed asttokens-2.2.1 backcall-0.2.0 decorator-5.1.1 elevenlabs-0.2.18 executing-1.2.0 ipython-8.14.0 jedi-0.18.2 matplotlib-inline-0.1.6 parso-0.8.3 pickleshare-0.7.5 prompt-toolkit-3.0.38 pure-eval-0.2.2 stack-data-0.6.2 traitlets-5.9.0 wcwidth-0.2.6 Collecting deep-translator==1.9.2 (from -r extensions\google_translate\requirements.txt (line 1)) Using cached deep_translator-1.9.2-py3-none-any.whl (30 kB) Collecting beautifulsoup4<5.0.0,>=4.9.1 (from deep-translator==1.9.2->-r extensions\google_translate\requirements.txt (line 1)) Using cached beautifulsoup4-4.12.2-py3-none-any.whl (142 kB) Requirement already satisfied: requests<3.0.0,>=2.23.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from deep-translator==1.9.2->-r extensions\google_translate\requirements.txt (line 1)) (2.31.0) Collecting soupsieve>1.2 (from beautifulsoup4<5.0.0,>=4.9.1->deep-translator==1.9.2->-r extensions\google_translate\requirements.txt (line 1)) Using cached soupsieve-2.4.1-py3-none-any.whl (36 kB) Requirement already satisfied: charset-normalizer<4,>=2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests<3.0.0,>=2.23.0->deep-translator==1.9.2->-r extensions\google_translate\requirements.txt (line 1)) (2.0.4) Requirement already satisfied: idna<4,>=2.5 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests<3.0.0,>=2.23.0->deep-translator==1.9.2->-r extensions\google_translate\requirements.txt (line 1)) (3.4) Requirement already satisfied: urllib3<3,>=1.21.1 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests<3.0.0,>=2.23.0->deep-translator==1.9.2->-r extensions\google_translate\requirements.txt (line 1)) (1.26.16) Requirement already satisfied: certifi>=2017.4.17 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests<3.0.0,>=2.23.0->deep-translator==1.9.2->-r extensions\google_translate\requirements.txt (line 1)) (2023.5.7) Installing collected packages: soupsieve, beautifulsoup4, deep-translator 
Successfully installed beautifulsoup4-4.12.2 deep-translator-1.9.2 soupsieve-2.4.1 Collecting ngrok==0. (from -r extensions\ngrok\requirements.txt (line 1)) Using cached ngrok-0.8.1-cp37-abi3-win_amd64.whl (2.4 MB) Installing collected packages: ngrok Successfully installed ngrok-0.8.1 Requirement already satisfied: flask_cloudflared==0.0.12 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from -r extensions\openai\requirements.txt (line 1)) (0.0.12) Collecting sentence-transformers (from -r extensions\openai\requirements.txt (line 2)) Using cached sentence_transformers-2.2.2-py3-none-any.whl Requirement already satisfied: Flask>=0.8 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (2.3.2) Requirement already satisfied: requests in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (2.31.0) Requirement already satisfied: transformers<5.0.0,>=4.6.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (4.30.0) Requirement already satisfied: tqdm in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (4.65.0) Requirement already satisfied: torch>=1.6.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (2.0.0) Requirement already satisfied: torchvision in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (0.15.0) Requirement already satisfied: numpy in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (1.24.3) Collecting scikit-learn (from sentence-transformers->-r extensions\openai\requirements.txt (line 2)) Using cached scikit_learn-1.2.2-cp310-cp310-win_amd64.whl (8.3 MB) Requirement already satisfied: scipy in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (1.10.1) Collecting nltk (from sentence-transformers->-r extensions\openai\requirements.txt (line 2)) Using cached nltk-3.8.1-py3-none-any.whl (1.5 MB) Requirement already satisfied: sentencepiece in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (0.1.99) Requirement already satisfied: huggingface-hub>=0.4.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (0.15.1) Requirement already satisfied: Werkzeug>=2.3.3 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (2.3.6) Requirement already satisfied: Jinja2>=3.1.2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (3.1.2) Requirement already satisfied: itsdangerous>=2.1.2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (2.1.2) Requirement already satisfied: 
click>=8.1.3 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (8.1.3) Requirement already satisfied: blinker>=1.6.2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (1.6.2) Requirement already satisfied: filelock in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from huggingface-hub>=0.4.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (3.9.0) Requirement already satisfied: fsspec in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from huggingface-hub>=0.4.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (2023.6.0) Requirement already satisfied: pyyaml>=5.1 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from huggingface-hub>=0.4.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (6.0) Requirement already satisfied: typing-extensions>=3.7.4.3 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from huggingface-hub>=0.4.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (4.6.3) Requirement already satisfied: packaging>=20.9 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from huggingface-hub>=0.4.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (23.1) Requirement already satisfied: sympy in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from torch>=1.6.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (1.11.1) Requirement already satisfied: networkx in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from torch>=1.6.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (2.8.4) Requirement already satisfied: colorama in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from tqdm->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (0.4.6) Requirement already satisfied: regex!=2019.12.17 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from transformers<5.0.0,>=4.6.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (2023.6.3) Requirement already satisfied: tokenizers!=0.11.3,<0.14,>=0.11.1 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from transformers<5.0.0,>=4.6.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (0.13.3) Requirement already satisfied: safetensors>=0.3.1 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from transformers<5.0.0,>=4.6.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (0.3.1) Collecting joblib (from nltk->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) Using cached joblib-1.2.0-py3-none-any.whl (297 kB) Requirement already satisfied: charset-normalizer<4,>=2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (2.0.4) Requirement already satisfied: idna<4,>=2.5 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (3.4) Requirement already satisfied: urllib3<3,>=1.21.1 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->flask_cloudflared==0.0.12->-r 
extensions\openai\requirements.txt (line 1)) (1.26.16) Requirement already satisfied: certifi>=2017.4.17 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests->flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (2023.5.7) Collecting threadpoolctl>=2.0.0 (from scikit-learn->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB) Requirement already satisfied: pillow!=8.3.,>=5.3.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from torchvision->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (9.5.0) Requirement already satisfied: MarkupSafe>=2.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from Jinja2>=3.1.2->Flask>=0.8->flask_cloudflared==0.0.12->-r extensions\openai\requirements.txt (line 1)) (2.1.1) Requirement already satisfied: mpmath>=0.19 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sympy->torch>=1.6.0->sentence-transformers->-r extensions\openai\requirements.txt (line 2)) (1.2.1) Installing collected packages: threadpoolctl, joblib, scikit-learn, nltk, sentence-transformers Successfully installed joblib-1.2.0 nltk-3.8.1 scikit-learn-1.2.2 sentence-transformers-2.2.2 threadpoolctl-3.1.0 Requirement already satisfied: ipython in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from -r extensions\silero_tts\requirements.txt (line 1)) (8.14.0) Collecting num2words (from -r extensions\silero_tts\requirements.txt (line 2)) Using cached num2words-0.5.12-py3-none-any.whl (125 kB) Collecting omegaconf (from -r extensions\silero_tts\requirements.txt (line 3)) Using cached omegaconf-2.3.0-py3-none-any.whl (79 kB) Requirement already satisfied: pydub in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from -r extensions\silero_tts\requirements.txt (line 4)) (0.25.1) Requirement already satisfied: PyYAML in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from -r extensions\silero_tts\requirements.txt (line 5)) (6.0) Requirement already satisfied: backcall in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython->-r extensions\silero_tts\requirements.txt (line 1)) (0.2.0) Requirement already satisfied: decorator in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython->-r extensions\silero_tts\requirements.txt (line 1)) (5.1.1) Requirement already satisfied: jedi>=0.16 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython->-r extensions\silero_tts\requirements.txt (line 1)) (0.18.2) Requirement already satisfied: matplotlib-inline in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython->-r extensions\silero_tts\requirements.txt (line 1)) (0.1.6) Requirement already satisfied: pickleshare in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython->-r extensions\silero_tts\requirements.txt (line 1)) (0.7.5) Requirement already satisfied: prompt-toolkit!=3.0.37,<3.1.0,>=3.0.30 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython->-r extensions\silero_tts\requirements.txt (line 1)) (3.0.38) Requirement already satisfied: pygments>=2.4.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython->-r extensions\silero_tts\requirements.txt (line 1)) (2.15.1) Requirement already satisfied: stack-data in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython->-r 
extensions\silero_tts\requirements.txt (line 1)) (0.6.2) Requirement already satisfied: traitlets>=5 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython->-r extensions\silero_tts\requirements.txt (line 1)) (5.9.0) Requirement already satisfied: colorama in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from ipython->-r extensions\silero_tts\requirements.txt (line 1)) (0.4.6) Collecting docopt>=0.6.2 (from num2words->-r extensions\silero_tts\requirements.txt (line 2)) Using cached docopt-0.6.2-py2.py3-none-any.whl Collecting antlr4-python3-runtime==4.9. (from omegaconf->-r extensions\silero_tts\requirements.txt (line 3)) Using cached antlr4_python3_runtime-4.9.3-py3-none-any.whl Requirement already satisfied: parso<0.9.0,>=0.8.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from jedi>=0.16->ipython->-r extensions\silero_tts\requirements.txt (line 1)) (0.8.3) Requirement already satisfied: wcwidth in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from prompt-toolkit!=3.0.37,<3.1.0,>=3.0.30->ipython->-r extensions\silero_tts\requirements.txt (line 1)) (0.2.6) Requirement already satisfied: executing>=1.2.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from stack-data->ipython->-r extensions\silero_tts\requirements.txt (line 1)) (1.2.0) Requirement already satisfied: asttokens>=2.1.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from stack-data->ipython->-r extensions\silero_tts\requirements.txt (line 1)) (2.2.1) Requirement already satisfied: pure-eval in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from stack-data->ipython->-r extensions\silero_tts\requirements.txt (line 1)) (0.2.2) Requirement already satisfied: six in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from asttokens>=2.1.0->stack-data->ipython->-r extensions\silero_tts\requirements.txt (line 1)) (1.16.0) Installing collected packages: docopt, antlr4-python3-runtime, omegaconf, num2words Successfully installed antlr4-python3-runtime-4.9.3 docopt-0.6.2 num2words-0.5.12 omegaconf-2.3.0 Collecting SpeechRecognition==3.10.0 (from -r extensions\whisper_stt\requirements.txt (line 1)) Using cached SpeechRecognition-3.10.0-py2.py3-none-any.whl (32.8 MB) Collecting openai-whisper (from -r extensions\whisper_stt\requirements.txt (line 2)) Using cached openai_whisper-20230314-py3-none-any.whl Collecting soundfile (from -r extensions\whisper_stt\requirements.txt (line 3)) Using cached soundfile-0.12.1-py2.py3-none-win_amd64.whl (1.0 MB) Collecting ffmpeg (from -r extensions\whisper_stt\requirements.txt (line 4)) Using cached ffmpeg-1.4-py3-none-any.whl Requirement already satisfied: requests>=2.26.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from SpeechRecognition==3.10.0->-r extensions\whisper_stt\requirements.txt (line 1)) (2.31.0) Collecting numba (from openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) Using cached numba-0.57.0-cp310-cp310-win_amd64.whl (2.6 MB) Requirement already satisfied: numpy in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (1.24.3) Requirement already satisfied: torch in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (2.0.0) Requirement already satisfied: tqdm in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from 
openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (4.65.0) Collecting more-itertools (from openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) Using cached more_itertools-9.1.0-py3-none-any.whl (54 kB) Collecting tiktoken==0.3.1 (from openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) Using cached tiktoken-0.3.1-cp310-cp310-win_amd64.whl (581 kB) Collecting ffmpeg-python==0.2.0 (from openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) Using cached ffmpeg_python-0.2.0-py3-none-any.whl (25 kB) Collecting future (from ffmpeg-python==0.2.0->openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) Using cached future-0.18.3-py3-none-any.whl Requirement already satisfied: regex>=2022.1.18 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from tiktoken==0.3.1->openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (2023.6.3) Requirement already satisfied: cffi>=1.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from soundfile->-r extensions\whisper_stt\requirements.txt (line 3)) (1.15.1) Requirement already satisfied: pycparser in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from cffi>=1.0->soundfile->-r extensions\whisper_stt\requirements.txt (line 3)) (2.21) Requirement already satisfied: charset-normalizer<4,>=2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests>=2.26.0->SpeechRecognition==3.10.0->-r extensions\whisper_stt\requirements.txt (line 1)) (2.0.4) Requirement already satisfied: idna<4,>=2.5 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests>=2.26.0->SpeechRecognition==3.10.0->-r extensions\whisper_stt\requirements.txt (line 1)) (3.4) Requirement already satisfied: urllib3<3,>=1.21.1 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests>=2.26.0->SpeechRecognition==3.10.0->-r extensions\whisper_stt\requirements.txt (line 1)) (1.26.16) Requirement already satisfied: certifi>=2017.4.17 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from requests>=2.26.0->SpeechRecognition==3.10.0->-r extensions\whisper_stt\requirements.txt (line 1)) (2023.5.7) Collecting llvmlite<0.41,>=0.40.0dev0 (from numba->openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) Using cached llvmlite-0.40.1rc1-cp310-cp310-win_amd64.whl (27.7 MB) Requirement already satisfied: filelock in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from torch->openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (3.9.0) Requirement already satisfied: typing-extensions in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from torch->openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (4.6.3) Requirement already satisfied: sympy in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from torch->openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (1.11.1) Requirement already satisfied: networkx in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from torch->openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (2.8.4) Requirement already satisfied: jinja2 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from torch->openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (3.1.2) Requirement already satisfied: colorama in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from tqdm->openai-whisper->-r 
extensions\whisper_stt\requirements.txt (line 2)) (0.4.6) Requirement already satisfied: MarkupSafe>=2.0 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from jinja2->torch->openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (2.1.1) Requirement already satisfied: mpmath>=0.19 in c:\sd\oobabooga_windows\installer_files\env\lib\site-packages (from sympy->torch->openai-whisper->-r extensions\whisper_stt\requirements.txt (line 2)) (1.2.1) Installing collected packages: ffmpeg, more-itertools, llvmlite, future, tiktoken, SpeechRecognition, soundfile, numba, ffmpeg-python, openai-whisper Successfully installed SpeechRecognition-3.10.0 ffmpeg-1.4 ffmpeg-python-0.2.0 future-0.18.3 llvmlite-0.40.1rc1 more-itertools-9.1.0 numba-0.57.0 openai-whisper-20230314 soundfile-0.12.1 tiktoken-0.3.1 Cloning into 'GPTQ-for-LLaMa'... remote: Enumerating objects: 818, done. remote: Counting objects: 100% (818/818), done. remote: Compressing objects: 100% (324/324), done. Receiving objects: 100% (818/818), 472.67 KiB | 1.17 MiB/s, done.d 0

Resolving deltas: 100% (496/496), done.
Already up to date.
Processing c:\sd\oobabooga_windows\text-generation-webui\repositories\gptq-for-llama
  Preparing metadata (setup.py) ... done
Building wheels for collected packages: quant-cuda
  Building wheel for quant-cuda (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [7 lines of output]
      running bdist_wheel
      running build
      running build_ext
      C:\SD\oobabooga_windows\installer_files\env\lib\site-packages\torch\utils\cpp_extension.py:359: UserWarning: Error checking compiler version for cl: [WinError 2] The system cannot find the file specified
        warnings.warn(f'Error checking compiler version for {compiler}: {error}')
      building 'quant_cuda' extension
      error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for quant-cuda
  Running setup.py clean for quant-cuda
Failed to build quant-cuda
ERROR: Could not build wheels for quant-cuda, which is required to install pyproject.toml-based projects

Collecting quant-cuda==0.0.0
  Using cached https://github.com/jllllll/GPTQ-for-LLaMa-Wheels/raw/main/quant_cuda-0.0.0-cp310-cp310-win_amd64.whl (398 kB)
Installing collected packages: quant-cuda
Successfully installed quant-cuda-0.0.0
Wheel installation success! Continuing with install..

bin C:\SD\oobabooga_windows\installer_files\env\lib\site-packages\bitsandbytes\libbitsandbytes_cuda117.dll
2023-06-14 00:57:44 INFO:Loading the extension "gallery"...
Press any key to continue . . .

jllllll commented 1 year ago

@BigBlueGX I'm not sure why you would be getting "Conda environment is empty. Press any key to continue . . ." since your log shows that it installed successfully. Does python.exe exist in C:\SD\oobabooga_windows\installer_files\env?
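A quick way to check is from a regular command prompt, using the path from your log (adjust it if your install lives elsewhere):

dir "C:\SD\oobabooga_windows\installer_files\env\python.exe"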

BigBlueGX commented 1 year ago

@jllllll I see a pythonw.exe in the directory

jllllll commented 1 year ago

@jllllll I see a pythonw.exe in the directory

But not a python.exe? That definitely shouldn't be missing. Check whether your anti-virus removed it. If not, you can try re-installing Python with this command from the cmd_windows.bat prompt:

conda install python=3.10.10

This may break the install; I'm not sure. A full re-install may be needed anyway.
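If the reinstall completes, a quick sanity check from the same cmd_windows.bat prompt should show the interpreter again:

python --version
where python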

BigBlueGX commented 1 year ago

@jllllll Disabled my antivirus software and did an automatic installation. You were right; it is now working.

PotatoCreator commented 1 year ago

I didn't find a working solution above, but this worked for me: get the one-click installer, edit the conda download URL in start_windows.bat to "https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe", then save and run the bat.
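For reference, the edited download line in start_windows.bat ends up looking something like this (the exact variable name may differ in your copy of the script; only the URL value matters):

set MINICONDA_DOWNLOAD_URL=https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe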

ericzhangcn1980 commented 1 year ago

I didn't find a working solution above, but this worked for me: get the one-click installer, edit the conda download URL in start_windows.bat to "https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe", then save and run the bat.

It worked!

VincentJGeisler commented 1 year ago

I've tried all of the above, and even installed conda manually. No joy; I'm still getting errors, just a different one.

So much broken.

Downloading Miniconda from https://repo.anaconda.com/miniconda/Miniconda3-py310_23.3.1-0-Linux-x86_64.sh to /mnt/c/Users/admin/Downloads/text-generation-webui-main/text-generation-webui-main/installer_files/miniconda_installer.sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 69.7M  100 69.7M    0     0  4248k      0  0:00:16  0:00:16 --:--:-- 3517k
ERROR: File or directory already exists: '/mnt/c/Users/admin/Downloads/text-generation-webui-main/text-generation-webui-main/installer_files/conda'
If you want to update an existing installation, use the -u option.
Miniconda version:
./start_linux.sh: line 41: /mnt/c/Users/admin/Downloads/text-generation-webui-main/text-generation-webui-main/installer_files/conda/bin/conda: No such file or directory
./start_linux.sh: line 46: /mnt/c/Users/admin/Downloads/text-generation-webui-main/text-generation-webui-main/installer_files/conda/bin/conda: No such file or directory
Conda environment is empty.

jllllll commented 1 year ago

@VincentJGeisler Are you trying to install with WSL?

If so, I highly recommend using the WSL-specific installer, as it contains some additional safeguards against some WSL bugs. If you prefer to run the WSL installer from within WSL instead of using the start_wsl.bat CMD script, you can execute wsl.sh directly with bash wsl.sh and bash wsl.sh --update to the same effect. Just keep in mind that Linux cannot process Windows line endings, so the file must be converted before execution. This is what the *_wsl.bat scripts do:

sed -i 's/\x0D$//' ./wsl.sh

The errors you had with start_linux.sh look like the kind that would arise from Windows line endings, so if that is the script you want to run then convert it as well:

sed -i 's/\x0D$//' ./start_linux.sh

Just make sure to run that script from within the WSL filesystem as WSL has an I/O bug that significantly slows down file operations when reading/writing files located in Windows drives (/mnt/).
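As a rough sketch, using the /mnt/c path from your log, copying the webui into the WSL home directory and running it from there would look like:

cp -r /mnt/c/Users/admin/Downloads/text-generation-webui-main/text-generation-webui-main ~/text-generation-webui
cd ~/text-generation-webui
sed -i 's/\x0D$//' ./start_linux.sh
bash ./start_linux.sh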

VincentJGeisler commented 1 year ago

The WSL attempt was pure desperation. It failed under PowerShell and cmd as well. It can't find Miniconda.

jllllll commented 1 year ago

For the Windows scripts, try to minimize the length of the path where text-generation-webui is stored, as Windows has a path length limit that Python packages tend to exceed. Try moving the webui files here: C:\text-generation-webui\
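For example, if the extracted files are currently buried under your user profile, something along these lines (the source path is just illustrative) gets them onto a short path:

move "C:\Users\<you>\Downloads\oobabooga_windows" "C:\text-generation-webui"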

That said, WSL works just fine and some people prefer it.

jllllll commented 1 year ago

Another thing to consider with Windows is the user account name. Windows has some restrictions on what the user account can be named, but not nearly enough. Windows will happily allow you to give your user account a name that breaks a lot of software, such as a name with spaces.

I remember one instance from several months ago where someone had a user account named admin and had to make a new user account just to run the webui. If I remember correctly, they eventually had to reinstall Windows entirely due to issues arising from that account name. It could have been something else causing the problem, but it is something to consider.

VincentJGeisler commented 1 year ago

user account named admin

Gah!!!! Curse my sysadmin ways!! That's probably it. The first thing I do on Windows is create a privileged admin account, do what I need to get things set up, then disable it to make sure it doesn't sneak back in later enabled with a blank password (yes, that was a thing at one point in time).

OK, I'll blow this away and start over, probably with Ubuntu, and just scrap the whole Windows thing. I started out with Windows because I started off with LM Studio. I figured it would be quick and easy, but it turned out it doesn't like ESXi running on blades. So with 144 Xeon cores, half a TB of RAM, and two A6000s, I couldn't get a 7B model to load.

Which is how I ended up here.

Anyway, mystery solved. Reinstalling now. Thanks for the heads-up.

SaltyBarnacles commented 1 year ago

I had the same issue: all the files inside the folder started showing the "Miniconda hook not found" and "The system cannot find the path specified" errors. After some time, I tried manually installing Miniconda inside my folder by opening the installer that the bat file had automatically downloaded, and it turns out that Miniconda cannot install into folders that contain "!" in their name. My folder was called "!TextGenerationGUI" because I wanted it to always stay on top of other folders, so I had to rename it. After removing the "!" from the folder name, the problem was solved.

I think the installer should warn users that it is impossible to install or run the text-generation-webui inside folders whose names contain special characters like "!", even though I doubt anyone besides me puts a "!" in the folder name just to keep it on top of other folders.

That solved my problem! Thank you so much.
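For anyone hitting the same thing, renaming the folder from a command prompt is enough (the drive letter here is just an example):

ren "D:\!TextGenerationGUI" "TextGenerationGUI"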

github-actions[bot] commented 11 months ago

This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.

AdlemElvis commented 2 months ago

I had this problem. In my case, I solved it by removing the accents from the folder name, renaming "Narração" to "Narracao", and it worked.