ParisNeo / lollms-webui

Lord of Large Language Models Web User Interface
https://parisneo.github.io/lollms-webui/
Apache License 2.0

Windows install issues #557

Open SoftologyPro opened 2 weeks ago

SoftologyPro commented 2 weeks ago

Looks like the latest Windows install win_install.bat has issues.

Building wheels for collected packages: wget
  Building wheel for wget (setup.py) ... done
  Created wheel for wget: filename=wget-3.2-py3-none-any.whl size=9683 sha256=a8a379249737a9e9e0cfff61b7e5320230f50ead675d42e840d455a30a8ec95c
  Stored in directory: D:\VoC_Systems\LoLLMS\installer_files\temp\pip-ephem-wheel-cache-6ur9cqak\wheels\40\b3\0f\a40dbd1c6861731779f62cc4babcb234387e11d697df70ee97
Successfully built wget
Installing collected packages: wget, sortedcontainers, pyreadline3, mpmath, flatbuffers, websocket-client, typing-extensions, sympy, soupsieve, sniffio, setuptools, safetensors, regex, pyyaml, python-multipart, PyPDF2, pycodestyle, protobuf, Pillow, numpy, multidict, lxml, jiter, humanfriendly, h11, fsspec, frozenlist, filelock, click, bidict, attrs, ascii_colors, annotated-types, aiohappyeyeballs, yarl, wsproto, uvicorn, tiktoken, pydantic-core, pipmaster, outcome, huggingface-hub, httpcore, coloredlogs, beautifulsoup4, autopep8, anyio, aiosignal, trio, tokenizers, starlette, simple-websocket, pydantic, onnxruntime, httpx, googlesearch-python, bs4, aiohttp, trio-websocket, transformers, python-engineio, openai, freedom-search, fastapi, selenium, python-socketio, lollmsvectordb, scrapemaster, lollms
  Attempting uninstall: setuptools
    Found existing installation: setuptools 72.1.0
    Uninstalling setuptools-72.1.0:
      Successfully uninstalled setuptools-72.1.0
  DEPRECATION: Legacy editable install of lollms==10.0.0 from file:///D:/VoC_Systems/LoLLMS/lollms-webui/lollms_core (setup.py develop) is deprecated. pip 25.0 will enforce this behaviour change. A possible replacement is to add a pyproject.toml or enable --use-pep517, and use setuptools >= 64. If the resulting installation is not behaving as expected, try using --config-settings editable_mode=compat. Please consult the setuptools documentation for more information. Discussion can be found at https://github.com/pypa/pip/issues/11457
  Running setup.py develop for lollms
Successfully installed Pillow-10.4.0 PyPDF2-3.0.1 aiohappyeyeballs-2.4.0 aiohttp-3.10.5 aiosignal-1.3.1 annotated-types-0.7.0 anyio-4.4.0 ascii_colors-0.4.2 attrs-24.2.0 autopep8-2.3.1 beautifulsoup4-4.12.3 bidict-0.23.1 bs4-0.0.2 click-8.1.7 coloredlogs-15.0.1 fastapi-0.112.2 filelock-3.15.4 flatbuffers-24.3.25 freedom-search-0.1.9 frozenlist-1.4.1 fsspec-2024.6.1 googlesearch-python-1.2.5 h11-0.14.0 httpcore-1.0.5 httpx-0.27.2 huggingface-hub-0.24.6 humanfriendly-10.0 jiter-0.5.0 lollms-10.0.0 lollmsvectordb-1.0.2 lxml-5.3.0 mpmath-1.3.0 multidict-6.0.5 numpy-2.1.0 onnxruntime-1.19.0 openai-1.43.0 outcome-1.3.0.post0 pipmaster-0.2.4 protobuf-5.28.0 pycodestyle-2.12.1 pydantic-2.8.2 pydantic-core-2.20.1 pyreadline3-3.4.1 python-engineio-4.9.1 python-multipart-0.0.9 python-socketio-5.11.3 pyyaml-6.0.2 regex-2024.7.24 safetensors-0.4.4 scrapemaster-0.2.0 selenium-4.24.0 setuptools-70.2.0 simple-websocket-1.0.0 sniffio-1.3.1 sortedcontainers-2.4.0 soupsieve-2.6 starlette-0.38.2 sympy-1.13.2 tiktoken-0.7.0 tokenizers-0.19.1 transformers-4.44.2 trio-0.26.2 trio-websocket-0.11.1 typing-extensions-4.12.2 uvicorn-0.30.6 websocket-client-1.8.0 wget-3.2 wsproto-1.2.0 yarl-1.9.6
The system cannot find the path specified.
Obtaining file:///D:/VoC_Systems/LoLLMS
ERROR: file:///D:/VoC_Systems/LoLLMS does not appear to be a Python project: neither 'setup.py' nor 'pyproject.toml' found.
Requirement already satisfied: mpmath==1.3.0 in d:\voc_systems\lollms\installer_files\lollms_env\lib\site-packages (from -r requirements.txt (line 1)) (1.3.0)
Collecting numpy==1.26.0 (from -r requirements.txt (line 2))
  Downloading numpy-1.26.0-cp311-cp311-win_amd64.whl.metadata (61 kB)
Collecting sympy==1.12 (from -r requirements.txt (line 3))
  Downloading sympy-1.12-py3-none-any.whl.metadata (12 kB)
Collecting typing_extensions==4.8.0 (from -r requirements.txt (line 4))
  Downloading typing_extensions-4.8.0-py3-none-any.whl.metadata (3.0 kB)
Collecting urllib3==2.0.5 (from -r requirements.txt (line 5))
  Downloading urllib3-2.0.5-py3-none-any.whl.metadata (6.6 kB)
Downloading numpy-1.26.0-cp311-cp311-win_amd64.whl (15.8 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 15.8/15.8 MB 10.3 MB/s eta 0:00:00
Downloading sympy-1.12-py3-none-any.whl (5.7 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.7/5.7 MB 10.6 MB/s eta 0:00:00
Downloading typing_extensions-4.8.0-py3-none-any.whl (31 kB)
Downloading urllib3-2.0.5-py3-none-any.whl (123 kB)
Installing collected packages: urllib3, typing_extensions, sympy, numpy
  Attempting uninstall: urllib3
    Found existing installation: urllib3 2.2.2
    Uninstalling urllib3-2.2.2:
      Successfully uninstalled urllib3-2.2.2
  Attempting uninstall: typing_extensions
    Found existing installation: typing_extensions 4.12.2
    Uninstalling typing_extensions-4.12.2:
      Successfully uninstalled typing_extensions-4.12.2
  Attempting uninstall: sympy
    Found existing installation: sympy 1.13.2
    Uninstalling sympy-1.13.2:
      Successfully uninstalled sympy-1.13.2
  Attempting uninstall: numpy
    Found existing installation: numpy 2.1.0
    Uninstalling numpy-2.1.0:
      Successfully uninstalled numpy-2.1.0
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
openai 1.43.0 requires typing-extensions<5,>=4.11, but you have typing-extensions 4.8.0 which is incompatible.
selenium 4.24.0 requires typing_extensions~=4.9, but you have typing-extensions 4.8.0 which is incompatible.
Successfully installed numpy-1.26.0 sympy-1.12 typing_extensions-4.8.0 urllib3-2.0.5
The system cannot find the path specified.
The system cannot find the path specified.

Then it asks

Select the default binding to be installed:
1) None (install the binding later)
2) Local binding - ollama
3) Local binding - python_llama_cpp
4) Local binding - bs_exllamav2
5) Remote binding - groq
6) Remote binding - open_router
7) Remote binding - open_ai
8) Remote binding - mistral_ai
9) Remote binding - gemini
10) Remote binding - vllm
11) Remote binding - xAI
12) Remote binding - elf
13) Remote binding - remote lollms

I choose 2.

Type the number of your choice and press Enter: 2
python: can't open file 'D:\\zoos\\bindings_zoo\\ollama\\__init__.py': [Errno 2] No such file or directory
Installation complete.

Then when I try to run it:

Starting LOLLMS Web UI...
"     ___       ___           ___       ___       ___           ___      "
"    /\__\     /\  \         /\__\     /\__\     /\__\         /\  \     "
"   /:/  /    /::\  \       /:/  /    /:/  /    /::|  |       /::\  \    "
"  /:/  /    /:/\:\  \     /:/  /    /:/  /    /:|:|  |      /:/\ \  \   "
" /:/  /    /:/  \:\  \   /:/  /    /:/  /    /:/|:|__|__   _\:\~\ \  \  "
"/:/__/    /:/__/ \:\__\ /:/__/    /:/__/    /:/ |::::\__\ /\ \:\ \ \__\ "
"\:\  \    \:\  \ /:/  / \:\  \    \:\  \    \/__/~~/:/  / \:\ \:\ \/__/ "
" \:\  \    \:\  /:/  /   \:\  \    \:\  \         /:/  /   \:\ \:\__\   "
"  \:\  \    \:\/:/  /     \:\  \    \:\  \       /:/  /     \:\/:/  /   "
"   \:\__\    \::/  /       \:\__\    \:\__\     /:/  /       \::/  /    "
"    \/__/     \/__/         \/__/     \/__/     \/__/         \/__/     "
By ParisNeo
Traceback (most recent call last):
  File "D:\VoC_Systems\LoLLMS\lollms-webui\app.py", line 8, in <module>
    from lollms.utilities import PackageManager
  File "d:\voc_systems\lollms\lollms-webui\lollms_core\lollms\utilities.py", line 36, in <module>
    import git
ModuleNotFoundError: No module named 'git'
Press any key to continue . . .
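
The git module imported in that traceback is provided by the GitPython package on PyPI, so until the installer is fixed it can be added to the installer's environment by hand. A minimal sketch, assuming the install layout shown in the log above (adjust the drive and folder for your own setup):

D:\VoC_Systems\LoLLMS\installer_files\lollms_env\Scripts\activate
pip install GitPython
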
ParisNeo commented 1 week ago

Can you try to install using the new v12 installer?

SoftologyPro commented 1 week ago

> Can you try to install using the new v12 installer?

OK, trying https://github.com/ParisNeo/lollms-webui/releases/download/v12/win_install.bat still gives these errors during install:

(screenshot of install errors)

Selecting the binding still fails:

(screenshot)

Then when starting, the same error:

(screenshot)

So, no difference with the v12 install bat.

RainerEmrich commented 1 week ago

Same issues here.

AccidentalJedi commented 1 week ago

Same here. After the new win_install.bat, I now get a "no sklearn" error. Manually installing it in conda gave an error that I forgot to copy; apologies.

AccidentalJedi commented 1 week ago

Still getting the Git error with the new installer:

(screenshot)

ParisNeo commented 1 week ago

Hi. I have fixed this problem in the new installer. Please download the new installer from the v12 release and it should work

SoftologyPro commented 1 week ago

Still does not work, both during install (when selecting the default binding) and when running it: ModuleNotFoundError: No module named 'sklearn'

KCBF commented 1 week ago

> Still does not work, both during install (when selecting the default binding) and when running it: ModuleNotFoundError: No module named 'sklearn'

For example, if the .bat file is at D:\LoLLM, then:

D:\LoLLM\installer_files\lollms_env\Scripts\activate
cd to the folder
curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
python get-pip.py
pip --version
pip install scikit-learn

That fixed sklearn for me, but now I get another error: No module named 'lollms'

But I have it:

(D:\LoLLM\installer_files\lollms_env) (lollms_env) D:\LoLLM\lollms-webui>pip show lollms
Name: lollms
Version: 10.0.0
Summary: A python library for AI personality definition
Home-page: https://github.com/ParisNeo/lollms
Author: Saifeddine ALOUI (ParisNeo)
Author-email: parisneo_ai@gmail.com
License:
Location: D:\LoLLM\lollms-webui\lollms_core
Editable project location: D:\LoLLM\lollms-webui\lollms_core
Requires: ascii_colors, autopep8, beautifulsoup4, fastapi, freedom-search, lollmsvectordb, Pillow, pipmaster, python-multipart, python-socketio, pyyaml, requests, scrapemaster, setuptools, tqdm, uvicorn, wget
Required-by:

(D:\LoLLM\installer_files\lollms_env) (lollms_env) D:\LoLLM\lollms-webui>win_run.bat
'win_run.bat' is not recognized as an internal or external command, operable program or batch file.

(D:\LoLLM\installer_files\lollms_env) (lollms_env) D:\LoLLM\lollms-webui>cd..

(D:\LoLLM\installer_files\lollms_env) (lollms_env) D:\LoLLM>win_run.bat
Starting LOLLMS Web UI...
(ASCII art banner)
By ParisNeo
Traceback (most recent call last):
  File "D:\LoLLM\lollms-webui\app.py", line 8, in <module>
    from lollms.utilities import PackageManager
ModuleNotFoundError: No module named 'lollms'
Press any key to continue . . .
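
Since pip show reports lollms as an editable install located at D:\LoLLM\lollms-webui\lollms_core, one thing worth trying (a sketch only, not a confirmed fix) is to re-run that editable install from inside the activated environment and then launch win_run.bat from the install root:

D:\LoLLM\installer_files\lollms_env\Scripts\activate
pip install -e D:\LoLLM\lollms-webui\lollms_core
cd D:\LoLLM
win_run.bat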

navarisun1982 commented 4 days ago

The same errors mentioned in this post happened to me on a fresh install. Over the time I have used LoLLMS, I have found setup and updates to be a big problem. It would be a good idea to consider smooth updates instead of starting everything from scratch on every update.

ParisNeo commented 3 days ago

Hi, the sklearn error should disappear if you upgrade to the latest version of lollmsvectordb.

If you have already installed lollms, then you can just use the conda session script (you can find it in the same folder as the lollms install) and then type:

pip install scikit-learn

This should fix it
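
A minimal sketch of those steps on Windows, assuming the conda session script is the win_conda_session batch file mentioned later in this thread and that it opens a prompt with lollms_env active; the path follows the first report's layout, so replace D:\VoC_Systems\LoLLMS with your own install folder:

cd D:\VoC_Systems\LoLLMS
win_conda_session.bat
pip install --upgrade lollmsvectordb
pip install scikit-learn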

SoftologyPro commented 3 days ago

I just installed with the latest win_install.bat, i.e. https://github.com/ParisNeo/lollms-webui/releases/download/v12/win_install.bat, in an empty directory.

Which of these is recommended? If installing the binding later works, then maybe just default to that and do not prompt users?

Select the default binding to be installed:
1) None (install the binding later)
2) Local binding - ollama
3) Local binding - python_llama_cpp
4) Local binding - bs_exllamav2
5) Remote binding - groq
6) Remote binding - open_router
7) Remote binding - open_ai
8) Remote binding - mistral_ai
9) Remote binding - gemini
10) Remote binding - vllm
11) Remote binding - xAI
12) Remote binding - elf
13) Remote binding - remote lollms

If I select option 2 I get this error

Type the number of your choice and press Enter: 2
python: can't open file 'D:\\Tests\\lollms\\lollms-webui\\zoos\\bindings_zoo\\ollama\\__init__.py': [Errno 2] No such file or directory
Installation complete.
Press any key to continue . . .

After that I run win_run.bat and get this error

(D:\Tests\lollms\installer_files\lollms_env) D:\Tests\lollms>win_run
Starting LOLLMS Web UI...
"     ___       ___           ___       ___       ___           ___      "
"    /\__\     /\  \         /\__\     /\__\     /\__\         /\  \     "
"   /:/  /    /::\  \       /:/  /    /:/  /    /::|  |       /::\  \    "
"  /:/  /    /:/\:\  \     /:/  /    /:/  /    /:|:|  |      /:/\ \  \   "
" /:/  /    /:/  \:\  \   /:/  /    /:/  /    /:/|:|__|__   _\:\~\ \  \  "
"/:/__/    /:/__/ \:\__\ /:/__/    /:/__/    /:/ |::::\__\ /\ \:\ \ \__\ "
"\:\  \    \:\  \ /:/  / \:\  \    \:\  \    \/__/~~/:/  / \:\ \:\ \/__/ "
" \:\  \    \:\  /:/  /   \:\  \    \:\  \         /:/  /   \:\ \:\__\   "
"  \:\  \    \:\/:/  /     \:\  \    \:\  \       /:/  /     \:\/:/  /   "
"   \:\__\    \::/  /       \:\__\    \:\__\     /:/  /       \::/  /    "
"    \/__/     \/__/         \/__/     \/__/     \/__/         \/__/     "
By ParisNeo
Traceback (most recent call last):
  File "D:\Tests\lollms\lollms-webui\app.py", line 8, in <module>
    from lollms.utilities import PackageManager
ModuleNotFoundError: No module named 'lollms'
Press any key to continue . . .

I can manually patch in scikit-learn (although that error seems to be fixed), but if you remember, I include support for LOLLMS in Visions of Chaos. If your install fails, it fails for any of my users who try it. So, can you fix it in your installer so it works? Better to fix it once in your installer than to have every Windows user who tries LOLLMS have to find this issue and make the patch themselves, if they even know how to find the env, activate the env, and install scikit-learn.

Burninggod commented 1 day ago

I ran into the same problem, and I still got ollama errors even after running pip install scikit-learn in win_conda_session.

(screenshots of the errors)