Closed mistrjirka closed 4 months ago
Describe the bug

I tried to install text-generation-webui on an Armbian system, but start_linux.sh crashes during installation (CPU mode).
Reproduction

Run start_linux.sh on Armbian (Python version 3.11.2) and select N (CPU mode) when asked for a GPU.
Screenshot

No response
Logs

Downloading Miniconda from https://repo.anaconda.com/miniconda/Miniconda3-py310_23.3.1-0-Linux-aarch64.sh to /home/jirka/programy/text-generation-webui/installer_files/miniconda_installer.sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 50.3M  100 50.3M    0     0  6469k      0  0:00:07  0:00:07 --:--:-- 6692k
PREFIX=/home/jirka/programy/text-generation-webui/installer_files/conda
Unpacking payload ...
Installing base environment...
Downloading and Extracting Packages
Preparing transaction: done
Executing transaction: done
installation finished.
Miniconda version: conda 23.3.1
Collecting package metadata (current_repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 23.3.1
  latest version: 24.1.2

Please update conda by running
    $ conda update -n base -c defaults conda
Or to minimize the number of packages updated during conda update use
    conda install conda=24.1.2

## Package Plan ##

  environment location: /home/jirka/programy/text-generation-webui/installer_files/env
  added / updated specs:
    - python=3.11

The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    bzip2-1.0.8                |       h998d150_5         211 KB
    ca-certificates-2023.12.12 |       hd43f75c_0         126 KB
    libffi-3.4.4               |       h419075a_0         139 KB
    openssl-3.0.13             |       h2f4d8fa_0         5.3 MB
    pip-23.3.1                 |  py311hd43f75c_0         3.3 MB
    python-3.11.8              |       h4bb2201_0        15.5 MB
    setuptools-68.2.2          |  py311hd43f75c_0         1.2 MB
    sqlite-3.41.2              |       h998d150_0         1.4 MB
    tzdata-2024a               |       h04d1e81_0         116 KB
    wheel-0.41.2               |  py311hd43f75c_0         141 KB
    xz-5.4.6                   |       h998d150_0         662 KB
    ------------------------------------------------------------
                                           Total:        28.1 MB

The following NEW packages will be INSTALLED:

  _libgcc_mutex      pkgs/main/linux-aarch64::_libgcc_mutex-0.1-main
  _openmp_mutex      pkgs/main/linux-aarch64::_openmp_mutex-5.1-51_gnu
  bzip2              pkgs/main/linux-aarch64::bzip2-1.0.8-h998d150_5
  ca-certificates    pkgs/main/linux-aarch64::ca-certificates-2023.12.12-hd43f75c_0
  ld_impl_linux-aar~ pkgs/main/linux-aarch64::ld_impl_linux-aarch64-2.38-h8131f2d_1
  libffi             pkgs/main/linux-aarch64::libffi-3.4.4-h419075a_0
  libgcc-ng          pkgs/main/linux-aarch64::libgcc-ng-11.2.0-h1234567_1
  libgomp            pkgs/main/linux-aarch64::libgomp-11.2.0-h1234567_1
  libstdcxx-ng       pkgs/main/linux-aarch64::libstdcxx-ng-11.2.0-h1234567_1
  libuuid            pkgs/main/linux-aarch64::libuuid-1.41.5-h998d150_0
  ncurses            pkgs/main/linux-aarch64::ncurses-6.4-h419075a_0
  openssl            pkgs/main/linux-aarch64::openssl-3.0.13-h2f4d8fa_0
  pip                pkgs/main/linux-aarch64::pip-23.3.1-py311hd43f75c_0
  python             pkgs/main/linux-aarch64::python-3.11.8-h4bb2201_0
  readline           pkgs/main/linux-aarch64::readline-8.2-h998d150_0
  setuptools         pkgs/main/linux-aarch64::setuptools-68.2.2-py311hd43f75c_0
  sqlite             pkgs/main/linux-aarch64::sqlite-3.41.2-h998d150_0
  tk                 pkgs/main/linux-aarch64::tk-8.6.12-h241ca14_0
  tzdata             pkgs/main/noarch::tzdata-2024a-h04d1e81_0
  wheel              pkgs/main/linux-aarch64::wheel-0.41.2-py311hd43f75c_0
  xz                 pkgs/main/linux-aarch64::xz-5.4.6-h998d150_0
  zlib               pkgs/main/linux-aarch64::zlib-1.2.13-h998d150_0

Downloading and Extracting Packages
Preparing transaction: done
Verifying transaction: done
Executing transaction: done

# To activate this environment, use
#     $ conda activate /home/jirka/programy/text-generation-webui/installer_files/env
# To deactivate an active environment, use
#     $ conda deactivate

What is your GPU?

A) NVIDIA
B) AMD (Linux/MacOS only. Requires ROCm SDK 5.6 on Linux)
C) Apple M Series
D) Intel Arc (IPEX)
N) None (I want to run models in CPU mode)

Input> N

*******************************************************************
* Adding the --cpu flag to CMD_FLAGS.txt.
*******************************************************************

*******************************************************************
* Installing PyTorch.
*******************************************************************

Collecting package metadata (current_repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 23.3.1
  latest version: 24.1.2

Please update conda by running
    $ conda update -n base -c defaults conda
Or to minimize the number of packages updated during conda update use
    conda install conda=24.1.2

## Package Plan ##

  environment location: /home/jirka/programy/text-generation-webui/installer_files/env
  added / updated specs:
    - git
    - ninja

The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    c-ares-1.19.1              |       h998d150_0         123 KB
    curl-8.5.0                 |       h6ac735f_0          89 KB
    expat-2.5.0                |       h419075a_0         151 KB
    gdbm-1.18                  |       hf59d7a7_4         205 KB
    gettext-0.21.0             |       h0cce8dc_1         3.3 MB
    git-2.40.1                 | pl5340h372b8bf_1        13.1 MB
    icu-73.1                   |       h419075a_0        26.2 MB
    krb5-1.20.1                |       h2e2fba8_1         1.5 MB
    libcurl-8.5.0              |       hfa2bbb0_0         434 KB
    libedit-3.1.20230828       |       h998d150_0         193 KB
    libev-4.33                 |       hfd63f10_1         113 KB
    libnghttp2-1.57.0          |       hb788212_0         735 KB
    libssh2-1.10.0             |       h6ac735f_2         315 KB
    libxml2-2.10.4             |       h045d036_1         806 KB
    ninja-1.10.2               |       hd43f75c_5           8 KB
    ninja-base-1.10.2          |       h59a28a9_5         118 KB
    pcre2-10.42                |       hcfaa891_0         1.3 MB
    perl-5.34.0                |       h998d150_2        12.5 MB
    ------------------------------------------------------------
                                           Total:        61.1 MB

The following NEW packages will be INSTALLED:

  c-ares             pkgs/main/linux-aarch64::c-ares-1.19.1-h998d150_0
  curl               pkgs/main/linux-aarch64::curl-8.5.0-h6ac735f_0
  expat              pkgs/main/linux-aarch64::expat-2.5.0-h419075a_0
  gdbm               pkgs/main/linux-aarch64::gdbm-1.18-hf59d7a7_4
  gettext            pkgs/main/linux-aarch64::gettext-0.21.0-h0cce8dc_1
  git                pkgs/main/linux-aarch64::git-2.40.1-pl5340h372b8bf_1
  icu                pkgs/main/linux-aarch64::icu-73.1-h419075a_0
  krb5               pkgs/main/linux-aarch64::krb5-1.20.1-h2e2fba8_1
  libcurl            pkgs/main/linux-aarch64::libcurl-8.5.0-hfa2bbb0_0
  libedit            pkgs/main/linux-aarch64::libedit-3.1.20230828-h998d150_0
  libev              pkgs/main/linux-aarch64::libev-4.33-hfd63f10_1
  libnghttp2         pkgs/main/linux-aarch64::libnghttp2-1.57.0-hb788212_0
  libssh2            pkgs/main/linux-aarch64::libssh2-1.10.0-h6ac735f_2
  libxml2            pkgs/main/linux-aarch64::libxml2-2.10.4-h045d036_1
  ninja              pkgs/main/linux-aarch64::ninja-1.10.2-hd43f75c_5
  ninja-base         pkgs/main/linux-aarch64::ninja-base-1.10.2-h59a28a9_5
  pcre2              pkgs/main/linux-aarch64::pcre2-10.42-hcfaa891_0
  perl               pkgs/main/linux-aarch64::perl-5.34.0-h998d150_2

Downloading and Extracting Packages
Preparing transaction: done
Verifying transaction: done
Executing transaction: done

Looking in indexes: https://download.pytorch.org/whl/cpu
Collecting torch==2.2.1
  Using cached https://download.pytorch.org/whl/cpu/torch-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (86.6 MB)
Collecting torchvision==0.17.1
  Using cached https://download.pytorch.org/whl/cpu/torchvision-0.17.1-cp311-cp311-linux_aarch64.whl (14.0 MB)
Collecting torchaudio==2.2.1
  Using cached https://download.pytorch.org/whl/cpu/torchaudio-2.2.1-cp311-cp311-linux_aarch64.whl (1.7 MB)
Collecting filelock (from torch==2.2.1)
  Using cached https://download.pytorch.org/whl/filelock-3.9.0-py3-none-any.whl (9.7 kB)
Collecting typing-extensions>=4.8.0 (from torch==2.2.1)
  Using cached https://download.pytorch.org/whl/typing_extensions-4.8.0-py3-none-any.whl (31 kB)
Collecting sympy (from torch==2.2.1)
  Using cached https://download.pytorch.org/whl/sympy-1.12-py3-none-any.whl (5.7 MB)
Collecting networkx (from torch==2.2.1)
  Using cached https://download.pytorch.org/whl/networkx-3.2.1-py3-none-any.whl (1.6 MB)
Collecting jinja2 (from torch==2.2.1)
  Using cached https://download.pytorch.org/whl/Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting fsspec (from torch==2.2.1)
  Using cached https://download.pytorch.org/whl/fsspec-2023.4.0-py3-none-any.whl (153 kB)
Collecting numpy (from torchvision==0.17.1)
  Using cached https://download.pytorch.org/whl/numpy-1.26.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (14.2 MB)
Collecting pillow!=8.3.*,>=5.3.0 (from torchvision==0.17.1)
  Using cached https://download.pytorch.org/whl/pillow-10.2.0-cp311-cp311-manylinux_2_28_aarch64.whl (4.3 MB)
Collecting MarkupSafe>=2.0 (from jinja2->torch==2.2.1)
  Using cached https://download.pytorch.org/whl/MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (28 kB)
Collecting mpmath>=0.19 (from sympy->torch==2.2.1)
  Using cached https://download.pytorch.org/whl/mpmath-1.3.0-py3-none-any.whl (536 kB)
Installing collected packages: mpmath, typing-extensions, sympy, pillow, numpy, networkx, MarkupSafe, fsspec, filelock, jinja2, torch, torchvision, torchaudio
Successfully installed MarkupSafe-2.1.3 filelock-3.9.0 fsspec-2023.4.0 jinja2-3.1.2 mpmath-1.3.0 networkx-3.2.1 numpy-1.26.3 pillow-10.2.0 sympy-1.12 torch-2.2.1 torchaudio-2.2.1 torchvision-0.17.1 typing-extensions-4.8.0
Collecting py-cpuinfo==9.0.0
  Using cached py_cpuinfo-9.0.0-py3-none-any.whl.metadata (794 bytes)
  Using cached py_cpuinfo-9.0.0-py3-none-any.whl (22 kB)
Installing collected packages: py-cpuinfo
Successfully installed py-cpuinfo-9.0.0

*******************************************************************
* Updating the local copy of the repository with "git pull"
*******************************************************************

Already up to date.

*******************************************************************
* Installing webui requirements from file: requirements_noavx2.txt
*******************************************************************

TORCH: 2.2.1
Ignoring bitsandbytes: markers 'platform_system == "Windows"' don't match your environment
Ignoring llama-cpp-python: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"' don't match your environment
Ignoring llama-cpp-python: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"' don't match your environment
Ignoring llama-cpp-python: markers 'platform_system == "Windows" and python_version == "3.11"' don't match your environment
Ignoring llama-cpp-python: markers 'platform_system == "Windows" and python_version == "3.10"' don't match your environment
Ignoring llama-cpp-python-cuda: markers 'platform_system == "Windows" and python_version == "3.11"' don't match your environment
Ignoring llama-cpp-python-cuda: markers 'platform_system == "Windows" and python_version == "3.10"' don't match your environment
Ignoring llama-cpp-python-cuda: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"' don't match your environment
Ignoring llama-cpp-python-cuda: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"' don't match your environment
Ignoring llama-cpp-python-cuda-tensorcores: markers 'platform_system == "Windows" and python_version == "3.11"' don't match your environment
Ignoring llama-cpp-python-cuda-tensorcores: markers 'platform_system == "Windows" and python_version == "3.10"' don't match your environment
Ignoring llama-cpp-python-cuda-tensorcores: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"' don't match your environment
Ignoring llama-cpp-python-cuda-tensorcores: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"' don't match your environment
Ignoring auto-gptq: markers 'platform_system == "Windows" and python_version == "3.11"' don't match your environment
Ignoring auto-gptq: markers 'platform_system == "Windows" and python_version == "3.10"' don't match your environment
Ignoring auto-gptq: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"' don't match your environment
Ignoring auto-gptq: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"' don't match your environment
Ignoring exllamav2: markers 'platform_system == "Windows" and python_version == "3.11"' don't match your environment
Ignoring exllamav2: markers 'platform_system == "Windows" and python_version == "3.10"' don't match your environment
Ignoring exllamav2: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"' don't match your environment
Ignoring exllamav2: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"' don't match your environment
Collecting exllamav2==0.0.15 (from -r temp_requirements.txt (line 63))
  Downloading https://github.com/oobabooga/exllamav2/releases/download/v0.0.15/exllamav2-0.0.15-py3-none-any.whl (137 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 137.9/137.9 kB 1.3 MB/s eta 0:00:00
Ignoring flash-attn: markers 'platform_system == "Windows" and python_version == "3.11"' don't match your environment
Ignoring flash-attn: markers 'platform_system == "Windows" and python_version == "3.10"' don't match your environment
Ignoring flash-attn: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"' don't match your environment
Ignoring flash-attn: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"' don't match your environment
Ignoring gptq-for-llama: markers 'platform_system == "Windows" and python_version == "3.11"' don't match your environment
Ignoring gptq-for-llama: markers 'platform_system == "Windows" and python_version == "3.10"' don't match your environment
Ignoring gptq-for-llama: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"' don't match your environment
Ignoring gptq-for-llama: markers 'platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"' don't match your environment
Collecting ctransformers==0.2.27+cu121 (from -r temp_requirements.txt (line 72))
  Downloading https://github.com/jllllll/ctransformers-cuBLAS-wheels/releases/download/AVX/ctransformers-0.2.27+cu121-py3-none-any.whl (15.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 15.6/15.6 MB 1.7 MB/s eta 0:00:00
Collecting accelerate==0.27.* (from -r temp_requirements.txt (line 1))
  Using cached accelerate-0.27.2-py3-none-any.whl.metadata (18 kB)
Collecting colorama (from -r temp_requirements.txt (line 2))
  Using cached colorama-0.4.6-py2.py3-none-any.whl.metadata (17 kB)
Collecting datasets (from -r temp_requirements.txt (line 3))
  Using cached datasets-2.18.0-py3-none-any.whl.metadata (20 kB)
Collecting einops (from -r temp_requirements.txt (line 4))
  Using cached einops-0.7.0-py3-none-any.whl.metadata (13 kB)
Collecting gradio==3.50.* (from -r temp_requirements.txt (line 5))
  Using cached gradio-3.50.2-py3-none-any.whl.metadata (17 kB)
Collecting hqq==0.1.5 (from -r temp_requirements.txt (line 6))
  Using cached hqq-0.1.5.tar.gz (31 kB)
  Preparing metadata (setup.py) ... done
Requirement already satisfied: jinja2==3.1.2 in ./installer_files/env/lib/python3.11/site-packages (from -r temp_requirements.txt (line 7)) (3.1.2)
Collecting lm_eval==0.3.0 (from -r temp_requirements.txt (line 8))
  Using cached lm_eval-0.3.0-py3-none-any.whl.metadata (6.8 kB)
Collecting markdown (from -r temp_requirements.txt (line 9))
  Using cached Markdown-3.5.2-py3-none-any.whl.metadata (7.0 kB)
Requirement already satisfied: numpy==1.26.* in ./installer_files/env/lib/python3.11/site-packages (from -r temp_requirements.txt (line 10)) (1.26.3)
Collecting numpy==1.26.* (from -r temp_requirements.txt (line 10))
  Using cached numpy-1.26.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (62 kB)
Collecting optimum==1.17.* (from -r temp_requirements.txt (line 11))
  Using cached optimum-1.17.1-py3-none-any.whl.metadata (18 kB)
Collecting pandas (from -r temp_requirements.txt (line 12))
  Using cached pandas-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (19 kB)
Collecting peft==0.8.* (from -r temp_requirements.txt (line 13))
  Using cached peft-0.8.2-py3-none-any.whl.metadata (25 kB)
Requirement already satisfied: Pillow>=9.5.0 in ./installer_files/env/lib/python3.11/site-packages (from -r temp_requirements.txt (line 14)) (10.2.0)
Collecting pyyaml (from -r temp_requirements.txt (line 15))
  Using cached PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (2.1 kB)
Collecting requests (from -r temp_requirements.txt (line 16))
  Using cached requests-2.31.0-py3-none-any.whl.metadata (4.6 kB)
Collecting rich (from -r temp_requirements.txt (line 17))
  Using cached rich-13.7.1-py3-none-any.whl.metadata (18 kB)
Collecting safetensors==0.4.* (from -r temp_requirements.txt (line 18))
  Using cached safetensors-0.4.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (3.8 kB)
Collecting scipy (from -r temp_requirements.txt (line 19))
  Using cached scipy-1.12.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (60 kB)
Collecting sentencepiece (from -r temp_requirements.txt (line 20))
  Using cached sentencepiece-0.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (7.7 kB)
Collecting tensorboard (from -r temp_requirements.txt (line 21))
  Using cached tensorboard-2.16.2-py3-none-any.whl.metadata (1.6 kB)
Collecting transformers==4.38.* (from -r temp_requirements.txt (line 22))
  Using cached transformers-4.38.2-py3-none-any.whl.metadata (130 kB)
Collecting tqdm (from -r temp_requirements.txt (line 23))
  Using cached tqdm-4.66.2-py3-none-any.whl.metadata (57 kB)
Collecting wandb (from -r temp_requirements.txt (line 24))
  Using cached wandb-0.16.4-py3-none-any.whl.metadata (10 kB)
Collecting SpeechRecognition==3.10.0 (from -r temp_requirements.txt (line 27))
  Using cached SpeechRecognition-3.10.0-py2.py3-none-any.whl.metadata (28 kB)
Collecting flask_cloudflared==0.0.14 (from -r temp_requirements.txt (line 28))
  Using cached flask_cloudflared-0.0.14-py3-none-any.whl.metadata (4.6 kB)
Collecting sse-starlette==1.6.5 (from -r temp_requirements.txt (line 29))
  Using cached sse_starlette-1.6.5-py3-none-any.whl.metadata (6.7 kB)
Collecting tiktoken (from -r temp_requirements.txt (line 30))
  Using cached tiktoken-0.6.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (6.6 kB)
Collecting bitsandbytes==0.42.* (from -r temp_requirements.txt (line 33))
  Using cached bitsandbytes-0.42.0-py3-none-any.whl.metadata (9.9 kB)
ERROR: Ignored the following versions that require a different python version: 1.21.2 Requires-Python >=3.7,<3.11; 1.21.3 Requires-Python >=3.7,<3.11; 1.21.4 Requires-Python >=3.7,<3.11; 1.21.5 Requires-Python >=3.7,<3.11; 1.21.6 Requires-Python >=3.7,<3.11; 1.6.2 Requires-Python >=3.7,<3.10; 1.6.3 Requires-Python >=3.7,<3.10; 1.7.0 Requires-Python >=3.7,<3.10; 1.7.1 Requires-Python >=3.7,<3.10; 1.7.2 Requires-Python >=3.7,<3.11; 1.7.3 Requires-Python >=3.7,<3.11; 1.8.0 Requires-Python >=3.8,<3.11; 1.8.0rc1 Requires-Python >=3.8,<3.11; 1.8.0rc2 Requires-Python >=3.8,<3.11; 1.8.0rc3 Requires-Python >=3.8,<3.11; 1.8.0rc4 Requires-Python >=3.8,<3.11; 1.8.1 Requires-Python >=3.8,<3.11
ERROR: Could not find a version that satisfies the requirement autoawq==0.1.8 (from versions: none)
ERROR: No matching distribution found for autoawq==0.1.8
Command '. "/home/jirka/programy/text-generation-webui/installer_files/conda/etc/profile.d/conda.sh" && conda activate "/home/jirka/programy/text-generation-webui/installer_files/env" && python -m pip install -r temp_requirements.txt --upgrade' failed with exit status code '1'.
Exiting now.
Try running the start/update script again.
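For context on the long run of "Ignoring …: markers … don't match your environment" lines above: each prebuilt wheel in requirements_noavx2.txt is guarded by a PEP 508 environment marker, and on this board `platform_machine` evaluates to `aarch64` rather than `x86_64`, so every x86-only wheel is skipped. A minimal sketch of the same check (the marker values are copied from the log; the helper function itself is illustrative, not part of the installer):

```python
# Show the environment values that pip evaluates PEP 508 markers against.
import platform

env = {
    "platform_system": platform.system(),
    "platform_machine": platform.machine(),
    "python_version": ".".join(platform.python_version_tuple()[:2]),
}

def marker_matches(system: str, machine: str, py: str) -> bool:
    """Re-check the marker used for llama-cpp-python in the log above."""
    return (
        env["platform_system"] == system
        and env["platform_machine"] == machine
        and env["python_version"] == py
    )

# On the ODROID-M1 (Linux / aarch64 / Python 3.11) this is False, because
# platform_machine is "aarch64", which is exactly why pip skips the wheel.
print(marker_matches("Linux", "x86_64", "3.11"))
```

The consequence is that every CUDA and x86-only wheel is (correctly) skipped, and the install only fails later on packages that have no marker guard at all.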
System Info

jirka@solar
-----------
OS: Armbian (24.2.1) aarch64
Host: Hardkernel ODROID-M1
Kernel: 6.6.2-edge-rk3568-odroid
Uptime: 15 mins
Packages: 579 (dpkg)
Shell: zsh 5.9
Terminal: /dev/pts/0
CPU: (4) @ 1.992GHz
Memory: 485MiB / 7692MiB

[neofetch ASCII logo omitted]
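The install ultimately fails on the last ERROR lines in the log: pip finds no aarch64 build of autoawq==0.1.8 at all ("from versions: none"), and since it is an unguarded requirement, the whole `pip install -r temp_requirements.txt` exits with status 1. One possible workaround, until the requirements are split by architecture, is to comment out the packages that ship no ARM wheels before the installer runs pip. A hypothetical sketch (only autoawq is confirmed by the log; the function name and any extra entries in the set are assumptions):

```python
# Sketch: comment out requirement lines for packages with no aarch64 wheels,
# so pip does not abort the whole install. Only autoawq is confirmed by the
# log above; extend the set if other packages fail the same way (assumption).
import re

NO_AARCH64_WHEELS = {"autoawq"}

def filter_requirements(text: str) -> str:
    """Return requirements text with unavailable packages commented out."""
    out = []
    for line in text.splitlines():
        # Package name = everything before the first version/extras separator.
        name = re.split(r"[=<>!~\[; ]", line.strip(), maxsplit=1)[0].lower()
        if name in NO_AARCH64_WHEELS:
            out.append("# " + line + "  # no aarch64 wheel")
        else:
            out.append(line)
    return "\n".join(out)

print(filter_requirements("autoawq==0.1.8\ntorch==2.2.1"))
```

Running something like this over temp_requirements.txt (or requirements_noavx2.txt) before the pip step would let the remaining CPU-friendly packages install; the commented-out packages simply remain unavailable, which is expected on a CUDA-less ARM board anyway.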
This issue has been closed due to inactivity for 2 months. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.