[Closed] Naugustogi closed this issue 1 year ago.
@Naugustogi,
Yes, something is wrong with your installation; it is running smoothly on my end.
I think you need to build it from source, as you did with alpaca.cpp (you only need cmake).
Please do that and let us know if you face any issues. Also let us know your setup and which OS you are using.
Thanks!
@abdeladim-s
I got this gigantic error after running the install:
...
File "<string>", line 118, in build_extension
File "C:\Users\Verwender\AppData\Local\Programs\Python\Python310\lib\subprocess.py", line 524, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['cmake', 'C:\\Users\\Verwender\\Desktop\\pyllamacpp\\pyllamacpp\\pyllamacpp', '-DCMAKE_LIBRARY_OUTPUT_DIRECTORY=C:\\Users\\Verwender\\AppData\\Local\\Temp\\tmp_cdynw8m.build-lib\\', '-DPYTHON_EXECUTABLE=C:\\Users\\Verwender\\Desktop\\pyllamacpp\\pyllamacpp\\pyllamacpp\\env\\Scripts\\python.exe', '-DCMAKE_BUILD_TYPE=Release', '-DEXAMPLE_VERSION_INFO=1.0.7', '-A', 'x64', '-DCMAKE_LIBRARY_OUTPUT_DIRECTORY_RELEASE=C:\\Users\\Verwender\\AppData\\Local\\Temp\\tmp_cdynw8m.build-lib']' returned non-zero exit status 1.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building editable for pyllamacpp
Failed to build pyllamacpp
ERROR: Could not build wheels for pyllamacpp, which is required to install pyproject.toml-based projects
Then, if I run cmake directly, another error appears:
-- Building for: NMake Makefiles
CMake Error at CMakeLists.txt:2 (project):
Running
'nmake' '-?'
CMake Error: CMAKE_C_COMPILER not set, after EnableLanguage
CMake Error: CMAKE_CXX_COMPILER not set, after EnableLanguage
-- Configuring incomplete, errors occurred!
What I did:
git clone --recursive https://github.com/nomic-ai/pyllamacpp && cd pyllamacpp
pip install .
and then I started installer.bat.
I use Windows 11, RTX 3060 Ti, 16 GB RAM, i7, SSD.
I was using the wrapper for a Discord bot. I don't want any executables; I think I missed that these are two completely different things.
I updated the CMakeLists hoping it will work on your side. Please try again now:
git clone --recursive https://github.com/abdeladim-s/pyllamacpp && cd pyllamacpp
pip install .
I think you have a very good machine; why not use WSL directly? Building C++ projects on Windows is not that easy.
Yes, you can create a wrapper for a Discord bot! I don't understand what you mean.
@abdeladim-s
Well, looks like the same error.
I'm not using WSL because I have no idea about it. I already made my bot with this repo via pip install pyllamacpp (2.0.0), and the problem was the speed. If I run a premade executable (llama.cpp or alpaca.cpp), it answers instantly. If I use the text-generation web UI on GPU, it is about 1 token/s. On CPU it takes a while before it generates any response, as with this repo (around 60-120 seconds before the first word). I have no understanding of this so far; any information about what it does would be helpful, so I can see what I need to do to make it run faster, or whether I have to rely on other optimisations in the future.
WSL will make your life easier. I think you are a developer, so it won't be hard for you to set it up; it is just a bunch of commands in PowerShell and you have Linux inside Windows.
Anyway, please copy/paste the entire error, so I can see where it is coming from!
@abdeladim-s here is the full error:
C:\Users\Verwender\Desktop\pyllamacpp2>git clone --recursive https://github.com/abdeladim-s/pyllamacpp && cd pyllamacpp
Cloning into 'pyllamacpp'...
remote: Enumerating objects: 650, done.
remote: Counting objects: 100% (650/650), done.
remote: Compressing objects: 100% (450/450), done.
remote: Total 650 (delta 193), reused 623 (delta 179), pack-reused 0
Receiving objects: 100% (650/650), 1.78 MiB | 1.03 MiB/s, done.
Resolving deltas: 100% (193/193), done.
Submodule 'llama.cpp' (https://github.com/ggerganov/llama.cpp.git) registered for path 'llama.cpp'
Cloning into 'C:/Users/Verwender/Desktop/pyllamacpp2/pyllamacpp/llama.cpp'...
remote: Enumerating objects: 2246, done.
remote: Counting objects: 100% (979/979), done.
remote: Compressing objects: 100% (164/164), done.
remote: Total 2246 (delta 892), reused 824 (delta 815), pack-reused 1267
Receiving objects: 100% (2246/2246), 2.20 MiB | 1.36 MiB/s, done.
Resolving deltas: 100% (1452/1452), done.
Submodule path 'llama.cpp': checked out '3525899277d2e2bdc8ec3f0e6e40c47251608700'
C:\Users\Verwender\Desktop\pyllamacpp2\pyllamacpp>pip install .
WARNING: Ignoring invalid distribution -yllamacpp (c:\users\verwender\appdata\local\programs\python\python310\lib\site-packages)
WARNING: Ignoring invalid distribution -yllamacpp (c:\users\verwender\appdata\local\programs\python\python310\lib\site-packages)
Processing c:\users\verwender\desktop\pyllamacpp2\pyllamacpp
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Building wheels for collected packages: pyllamacpp
Building wheel for pyllamacpp (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for pyllamacpp (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [248 lines of output]
C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\config\_apply_pyprojecttoml.py:62: _WouldIgnoreField: `description` defined outside of `pyproject.toml` would be ignored.
!!
********************************************************************************
##########################################################################
# configuration would be ignored/result in error due to `pyproject.toml` #
##########################################################################
The following seems to be defined outside of `pyproject.toml`:
`description = 'Python bindings for llama.cpp'`
According to the spec (see the link below), however, setuptools CANNOT
consider this value unless `description` is listed as `dynamic`.
https://packaging.python.org/en/latest/specifications/declaring-project-metadata/
For the time being, `setuptools` will still consider the given value (as a
**transitional** measure), but please note that future releases of setuptools will
follow strictly the standard.
To prevent this warning, you can list `description` under `dynamic` or alternatively
remove the `[project]` table from your file and rely entirely on other means of
configuration.
By 2023-Oct-30, you need to update your project and remove deprecated calls
or your builds will no longer be supported.
********************************************************************************
!!
_handle_missing_dynamic(dist, project_table)
C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\config\_apply_pyprojecttoml.py:62: _WouldIgnoreField: `requires-python` defined outside of `pyproject.toml` would be ignored.
!!
********************************************************************************
##########################################################################
# configuration would be ignored/result in error due to `pyproject.toml` #
##########################################################################
The following seems to be defined outside of `pyproject.toml`:
`requires-python = '>=3.8'`
According to the spec (see the link below), however, setuptools CANNOT
consider this value unless `requires-python` is listed as `dynamic`.
https://packaging.python.org/en/latest/specifications/declaring-project-metadata/
For the time being, `setuptools` will still consider the given value (as a
**transitional** measure), but please note that future releases of setuptools will
follow strictly the standard.
To prevent this warning, you can list `requires-python` under `dynamic` or alternatively
remove the `[project]` table from your file and rely entirely on other means of
configuration.
By 2023-Oct-30, you need to update your project and remove deprecated calls
or your builds will no longer be supported.
********************************************************************************
!!
_handle_missing_dynamic(dist, project_table)
C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\config\_apply_pyprojecttoml.py:62: _WouldIgnoreField: `license` defined outside of `pyproject.toml` would be ignored.
!!
********************************************************************************
##########################################################################
# configuration would be ignored/result in error due to `pyproject.toml` #
##########################################################################
The following seems to be defined outside of `pyproject.toml`:
`license = 'MIT'`
According to the spec (see the link below), however, setuptools CANNOT
consider this value unless `license` is listed as `dynamic`.
https://packaging.python.org/en/latest/specifications/declaring-project-metadata/
For the time being, `setuptools` will still consider the given value (as a
**transitional** measure), but please note that future releases of setuptools will
follow strictly the standard.
To prevent this warning, you can list `license` under `dynamic` or alternatively
remove the `[project]` table from your file and rely entirely on other means of
configuration.
By 2023-Oct-30, you need to update your project and remove deprecated calls
or your builds will no longer be supported.
********************************************************************************
!!
_handle_missing_dynamic(dist, project_table)
C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\config\_apply_pyprojecttoml.py:62: _WouldIgnoreField: `authors` defined outside of `pyproject.toml` would be ignored.
!!
********************************************************************************
##########################################################################
# configuration would be ignored/result in error due to `pyproject.toml` #
##########################################################################
The following seems to be defined outside of `pyproject.toml`:
`authors = 'Abdeladim Sadiki'`
According to the spec (see the link below), however, setuptools CANNOT
consider this value unless `authors` is listed as `dynamic`.
https://packaging.python.org/en/latest/specifications/declaring-project-metadata/
For the time being, `setuptools` will still consider the given value (as a
**transitional** measure), but please note that future releases of setuptools will
follow strictly the standard.
To prevent this warning, you can list `authors` under `dynamic` or alternatively
remove the `[project]` table from your file and rely entirely on other means of
configuration.
By 2023-Oct-30, you need to update your project and remove deprecated calls
or your builds will no longer be supported.
********************************************************************************
!!
_handle_missing_dynamic(dist, project_table)
C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\config\_apply_pyprojecttoml.py:62: _WouldIgnoreField: `urls` defined outside of `pyproject.toml` would be ignored.
!!
********************************************************************************
##########################################################################
# configuration would be ignored/result in error due to `pyproject.toml` #
##########################################################################
The following seems to be defined outside of `pyproject.toml`:
`urls = {'Documentation': 'https://abdeladim-s.github.io/pyllamacpp', 'Source': 'https://github.com/abdeladim-s/pyllamacpp', 'Tracker': 'https://github.com/abdeladim-s/pyllamacpp/issues'}`
According to the spec (see the link below), however, setuptools CANNOT
consider this value unless `urls` is listed as `dynamic`.
https://packaging.python.org/en/latest/specifications/declaring-project-metadata/
For the time being, `setuptools` will still consider the given value (as a
**transitional** measure), but please note that future releases of setuptools will
follow strictly the standard.
To prevent this warning, you can list `urls` under `dynamic` or alternatively
remove the `[project]` table from your file and rely entirely on other means of
configuration.
By 2023-Oct-30, you need to update your project and remove deprecated calls
or your builds will no longer be supported.
********************************************************************************
!!
_handle_missing_dynamic(dist, project_table)
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-cpython-310
creating build\lib.win-amd64-cpython-310\pyllamacpp
copying .\pyllamacpp\cli.py -> build\lib.win-amd64-cpython-310\pyllamacpp
copying .\pyllamacpp\constants.py -> build\lib.win-amd64-cpython-310\pyllamacpp
copying .\pyllamacpp\model.py -> build\lib.win-amd64-cpython-310\pyllamacpp
copying .\pyllamacpp\utils.py -> build\lib.win-amd64-cpython-310\pyllamacpp
copying .\pyllamacpp\webui.py -> build\lib.win-amd64-cpython-310\pyllamacpp
copying .\pyllamacpp\_logger.py -> build\lib.win-amd64-cpython-310\pyllamacpp
copying .\pyllamacpp\__init__.py -> build\lib.win-amd64-cpython-310\pyllamacpp
running egg_info
writing pyllamacpp.egg-info\PKG-INFO
writing dependency_links to pyllamacpp.egg-info\dependency_links.txt
writing entry points to pyllamacpp.egg-info\entry_points.txt
writing top-level names to pyllamacpp.egg-info\top_level.txt
reading manifest file 'pyllamacpp.egg-info\SOURCES.txt'
reading manifest template 'MANIFEST.in'
adding license file 'LICENSE'
writing manifest file 'pyllamacpp.egg-info\SOURCES.txt'
running build_ext
-- Building for: NMake Makefiles
CMake Error at CMakeLists.txt:8 (project):
Generator
NMake Makefiles
does not support platform specification, but platform
x64
was specified.
CMake Error: CMAKE_C_COMPILER not set, after EnableLanguage
CMake Error: CMAKE_CXX_COMPILER not set, after EnableLanguage
-- Configuring incomplete, errors occurred!
Traceback (most recent call last):
File "C:\Users\Verwender\AppData\Local\Programs\Python\Python310\lib\site-packages\pip\_vendor\pep517\in_process\_in_process.py", line 363, in <module>
main()
File "C:\Users\Verwender\AppData\Local\Programs\Python\Python310\lib\site-packages\pip\_vendor\pep517\in_process\_in_process.py", line 345, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "C:\Users\Verwender\AppData\Local\Programs\Python\Python310\lib\site-packages\pip\_vendor\pep517\in_process\_in_process.py", line 261, in build_wheel
return _build_backend().build_wheel(wheel_directory, config_settings,
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\build_meta.py", line 416, in build_wheel
return self._build_with_temp_dir(['bdist_wheel'], '.whl',
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\build_meta.py", line 401, in _build_with_temp_dir
self.run_setup()
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\build_meta.py", line 338, in run_setup
exec(code, locals())
File "<string>", line 132, in <module>
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\__init__.py", line 107, in setup
return distutils.core.setup(**attrs)
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\core.py", line 185, in setup
return run_commands(dist)
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\core.py", line 201, in run_commands
dist.run_commands()
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 969, in run_commands
self.run_command(cmd)
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\dist.py", line 1244, in run_command
super().run_command(command)
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
cmd_obj.run()
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\wheel\bdist_wheel.py", line 343, in run
self.run_command("build")
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\cmd.py", line 318, in run_command
self.distribution.run_command(command)
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\dist.py", line 1244, in run_command
super().run_command(command)
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
cmd_obj.run()
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\command\build.py", line 131, in run
self.run_command(cmd_name)
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\cmd.py", line 318, in run_command
self.distribution.run_command(command)
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\dist.py", line 1244, in run_command
super().run_command(command)
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
cmd_obj.run()
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\command\build_ext.py", line 84, in run
_build_ext.run(self)
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\command\build_ext.py", line 345, in run
self.build_extensions()
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\command\build_ext.py", line 467, in build_extensions
self._build_extensions_serial()
File "C:\Users\Verwender\AppData\Local\Temp\pip-build-env-3mp60k_f\overlay\Lib\site-packages\setuptools\_distutils\command\build_ext.py", line 493, in _build_extensions_serial
self.build_extension(ext)
File "<string>", line 118, in build_extension
File "C:\Users\Verwender\AppData\Local\Programs\Python\Python310\lib\subprocess.py", line 524, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['cmake', 'C:\\Users\\Verwender\\Desktop\\pyllamacpp2\\pyllamacpp', '-DCMAKE_LIBRARY_OUTPUT_DIRECTORY=C:\\Users\\Verwender\\Desktop\\pyllamacpp2\\pyllamacpp\\build\\lib.win-amd64-cpython-310\\', '-DPYTHON_EXECUTABLE=C:\\Users\\Verwender\\AppData\\Local\\Programs\\Python\\Python310\\python.exe', '-DCMAKE_BUILD_TYPE=Release', '-DEXAMPLE_VERSION_INFO=2.0.0', '-A', 'x64', '-DCMAKE_LIBRARY_OUTPUT_DIRECTORY_RELEASE=C:\\Users\\Verwender\\Desktop\\pyllamacpp2\\pyllamacpp\\build\\lib.win-amd64-cpython-310']' returned non-zero exit status 1.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for pyllamacpp
Failed to build pyllamacpp
ERROR: Could not build wheels for pyllamacpp, which is required to install pyproject.toml-based projects
WARNING: Ignoring invalid distribution -yllamacpp (c:\users\verwender\appdata\local\programs\python\python310\lib\site-packages)
WARNING: Ignoring invalid distribution -yllamacpp (c:\users\verwender\appdata\local\programs\python\python310\lib\site-packages)
WARNING: Ignoring invalid distribution -yllamacpp (c:\users\verwender\appdata\local\programs\python\python310\lib\site-packages)
[notice] A new release of pip available: 22.2.1 -> 23.1.2
[notice] To update, run: python.exe -m pip install --upgrade pip
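For what it's worth, the `subprocess.CalledProcessError` at the bottom of the log is just the build script surfacing cmake's non-zero exit status: with `check=True`, `subprocess.run` raises as soon as the child process fails. A minimal sketch of that mechanism (using the Python interpreter itself as the failing command so the sketch runs anywhere):

```python
import subprocess
import sys

def run_checked(cmd):
    # check=True turns any non-zero exit status into CalledProcessError,
    # which is exactly what the pip build log shows for the cmake call.
    return subprocess.run(cmd, check=True)

try:
    # a child process that exits with status 1, standing in for the failing cmake configure
    run_checked([sys.executable, "-c", "import sys; sys.exit(1)"])
except subprocess.CalledProcessError as e:
    print(f"command failed with exit status {e.returncode}")
```

So the traceback does not point at a pip bug; the real failure is whatever made cmake exit non-zero (here, the NMake generator being asked for an `-A x64` platform it does not support, plus no C/C++ compiler being found).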
Alright, I'm looking into WSL.
I think you do not have cmake in the PATH variable, or it is not installed. Are you sure you built alpaca.cpp yourself?
You need to install Visual Studio C++ and CMake first, before building.
Please run this to check if you have it:
cmake --version
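As a quick sanity check, the first line that `cmake --version` prints can be parsed and compared against a minimum version. A hedged sketch (the 3.12 floor here is an assumption for illustration, not the project's documented requirement):

```python
import re

def cmake_version_ok(version_output: str, minimum=(3, 12)) -> bool:
    """Return True if the first line of `cmake --version` output
    reports a version >= `minimum`, a (major, minor) tuple."""
    m = re.search(r"cmake version (\d+)\.(\d+)", version_output)
    if m is None:
        return False  # unparseable output: treat as "cmake not usable"
    return (int(m.group(1)), int(m.group(2))) >= minimum

# e.g. with the version reported later in this thread:
# cmake_version_ok("cmake version 3.26.3")  -> True
```

Tuple comparison does the right thing here: (3, 26) >= (3, 12) compares major first, then minor.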
I never built any of them myself; I simply use precompiled versions. I'm not going to install Visual Studio C++ to use it one time; I simply don't have, and don't want to use, the bandwidth for it. I can try to reinstall cmake with PATH set. cmake is version 3.26.3. I installed WSL.
Also, the basic problem wasn't that I wanted to compile any of these; you can check my gpt4-x-alpaca code for using the wrapper: https://github.com/Naugustogi/gpt4-x-Alpaca-Discord-bot/blob/main/alpacaxgpt4.py Nothing interesting, but in use it's very slow. That was the main problem and why I opened this issue, not compiling it myself. Note that I don't know where the wrapper takes llama.cpp into account.
Yeah, I understand. Sometimes things do not go as expected (especially in IT), so one needs to be patient and learn along the way.
You see, when you do pip install, pip is just pulling a prebuilt wheel from the PyPI website. The prebuilt wheels are built on a standard virtual machine (llama.cpp is the source code used to build that wheel).
So maybe it is running slow because it is not the right prebuilt wheel for your machine; that's why I told you to build it. I hope you get it now!
If you installed WSL, try the package first as usual (from pip install), and then try to build it if it is still slow. I think cmake will probably already be installed.
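To make the prebuilt-wheel point concrete: a wheel's filename encodes the interpreter, ABI, and platform it was built for, and pip selects by those tags. The tags say nothing about CPU features like AVX, which is why a generic wheel can run slower than a local build. A small sketch of splitting those tags (the example filename is illustrative, not an actual PyPI artifact):

```python
def wheel_tags(filename: str) -> dict:
    """Split a wheel filename into its metadata fields.
    Format: {dist}-{version}(-{build})?-{python}-{abi}-{platform}.whl"""
    parts = filename[:-len(".whl")].split("-")
    dist, version = parts[0], parts[1]
    # the last three fields are always the compatibility tags
    python_tag, abi_tag, platform_tag = parts[-3], parts[-2], parts[-1]
    return {"distribution": dist, "version": version,
            "python": python_tag, "abi": abi_tag, "platform": platform_tag}

# e.g. a CPython 3.10, 64-bit Windows wheel (hypothetical name):
# wheel_tags("pyllamacpp-2.0.0-cp310-cp310-win_amd64.whl")["platform"] -> "win_amd64"
```

Building from source instead lets the compiler target the exact CPU, picking up AVX/AVX2/FMA where available.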
So I did run it with WSL; the only difference is that it doesn't stop using my CPU, so I guess it's a small amount faster? pip install . or cmake . (PATH is set) both fail with the same compile error. Is there really no way around Visual Studio C++? Which of these should I install at all? I get that with the wheel now.
Well, which one should I choose?
Only faster by a small amount, that's weird!
Try to build on WSL as well, not Windows: pip install .
C:\Users\Verwender\Desktop\pyllamacpp2\pyllamacpp>wsl
root@User:/mnt/c/Users/Verwender/Desktop/pyllamacpp2/pyllamacpp# pip install .
Processing /mnt/c/Users/Verwender/Desktop/pyllamacpp2/pyllamacpp
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Building wheels for collected packages: pyllamacpp
Building wheel for pyllamacpp (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for pyllamacpp (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [5 lines of output]
running bdist_wheel
running build
running build_py
running build_ext
error: [Errno 2] No such file or directory: 'cmake'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for pyllamacpp
Failed to build pyllamacpp
ERROR: Could not build wheels for pyllamacpp, which is required to install pyproject.toml-based projects
root@User:/mnt/c/Users/Verwender/Desktop/pyllamacpp2/pyllamacpp#
Already tried that with WSL too.
pip install cmake
root@User:/mnt/c/Users/Verwender/Desktop/pyllamacpp2/pyllamacpp# pip3 install cmake
Collecting cmake
Using cached cmake-3.26.3-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (24.0 MB)
Installing collected packages: cmake
Successfully installed cmake-3.26.3
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
root@User:/mnt/c/Users/Verwender/Desktop/pyllamacpp2/pyllamacpp# pip install .
Processing /mnt/c/Users/Verwender/Desktop/pyllamacpp2/pyllamacpp
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Building wheels for collected packages: pyllamacpp
Building wheel for pyllamacpp (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for pyllamacpp (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [55 lines of output]
running bdist_wheel
running build
running build_py
running build_ext
CMake Error: The current CMakeCache.txt directory /mnt/c/Users/Verwender/Desktop/pyllamacpp2/pyllamacpp/CMakeCache.txt is different than the directory c:/Users/Verwender/Desktop/pyllamacpp2/pyllamacpp where CMakeCache.txt was created. This may result in binaries being created in the wrong place. If you are not sure, reedit the CMakeCache.txt
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for pyllamacpp
Failed to build pyllamacpp
ERROR: Could not build wheels for pyllamacpp, which is required to install pyproject.toml-based projects
root@User:/mnt/c/Users/Verwender/Desktop/pyllamacpp2/pyllamacpp#
My mistake, but I still get this result.
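The CMakeCache error in that log is the Windows-side configure leaving behind a `CMakeCache.txt` with `c:/...` paths, which the WSL build then refuses to reuse. One way around it is simply deleting the stale cache before reconfiguring; a sketch, assuming you want to wipe `CMakeCache.txt` and the `CMakeFiles/` directory in the source tree:

```python
import pathlib
import shutil

def clean_cmake_cache(project_dir: str) -> list:
    """Delete a stale CMakeCache.txt / CMakeFiles left by a previous
    configure in the same source tree; returns the names removed."""
    root = pathlib.Path(project_dir)
    removed = []
    cache = root / "CMakeCache.txt"
    if cache.exists():
        cache.unlink()
        removed.append("CMakeCache.txt")
    cmake_files = root / "CMakeFiles"
    if cmake_files.is_dir():
        shutil.rmtree(cmake_files)
        removed.append("CMakeFiles")
    return removed
```

The simpler fix, of course, is a fresh `git clone` inside WSL so the Windows and Linux builds never share a source tree.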
try this instead:
pip install git+https://github.com/abdeladim-s/pyllamacpp.git
That's way faster; I still don't know what changed (~40 seconds).
Copy/Paste the result of this:
from _pyllamacpp import llama_print_system_info
print(llama_print_system_info())
C:\Users\Verwender>python
Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug 1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> from _pyllamacpp import llama_print_system_info
>>> print(llama_print_system_info())
AVX = 1 | AVX2 = 1 | AVX512 = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | VSX = 0 |
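If it ever helps to compare these flags programmatically rather than by eye, the string can be parsed into a dict (a quick sketch; the `info` value here is just the string pasted above):

```python
# The system_info string as returned by llama_print_system_info()
info = ("AVX = 1 | AVX2 = 1 | AVX512 = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | "
        "F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | VSX = 0 |")

# Turn the "NAME = 0/1" pairs into a dict for easy lookups
flags = {}
for part in info.split("|"):
    part = part.strip()
    if part:
        name, value = part.split("=")
        flags[name.strip()] = int(value)

print(flags["AVX2"])  # 1 on this machine
```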
I think everything is good now. Same as mine.
If you want to run one more check:
from pyllamacpp.model import Model

def new_text_callback(text: str):
    print(text, end="", flush=True)

llama_config = {"n_ctx": 2048}
# model_path: path to your ggml model file
model = Model(ggml_model=model_path, **llama_config)
question = "Can you tell me, how did the Dutch obtain Manhattan, and what did it cost?\n"
model._generate(question, new_text_callback=new_text_callback)
and copy/paste the results? It will report the timings (tokens/s) at the end!
llama_model_load: loading model from 'gpt4-x-alpaca-13b-native-ggml-model-q4_0.bin' - please wait ...
llama_model_load: n_vocab = 32001
llama_model_load: n_ctx   = 2048
llama_model_load: n_embd  = 5120
llama_model_load: n_mult  = 256
llama_model_load: n_head  = 40
llama_model_load: n_layer = 40
llama_model_load: n_rot   = 128
llama_model_load: f16     = 2
llama_model_load: n_ff    = 13824
llama_model_load: n_parts = 2
llama_model_load: type    = 2
llama_model_load: ggml map size = 7759.84 MB
llama_model_load: ggml ctx size = 101.25 KB
llama_model_load: mem required  = 9807.93 MB (+ 3216.00 MB per state)
llama_model_load: loading tensors from 'gpt4-x-alpaca-13b-native-ggml-model-q4_0.bin'
llama_model_load: model size = 7759.40 MB / num tensors = 363
llama_init_from_file: kv self size = 3200.00 MB
llama_generate: seed = 1682907787
system_info: n_threads = 20 / 20 | AVX = 1 | AVX2 = 1 | AVX512 = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | VSX = 0 |
sampling: temp = 0.800000, top_k = 40, top_p = 0.950000, repeat_last_n = 64, repeat_penalty = 1.100000
generate: n_ctx = 2048, n_batch = 8, n_predict = 128, n_keep = 0
Can you tell me, how did the Dutch obtain Manhattan, and what did it cost? släktet комаров с добавлением новых фотографий и видеороликов. [end of text]
llama_print_timings:        load time = 12879.61 ms
llama_print_timings:      sample time =    19.32 ms / 22 runs   (  0.88 ms per run)
llama_print_timings: prompt eval time = 16656.92 ms / 21 tokens (793.19 ms per token)
llama_print_timings:        eval time =  5862.53 ms / 21 runs   (279.17 ms per run)
llama_print_timings:       total time = 24045.04 ms
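As a sanity check, the per-token rates can be recomputed from the figures in that timings report:

```python
# Figures copied from the llama_print_timings output above (milliseconds)
prompt_eval_ms, prompt_tokens = 16656.92, 21
eval_ms, eval_runs = 5862.53, 21

ms_per_prompt_token = prompt_eval_ms / prompt_tokens  # prompt ingestion cost
gen_tokens_per_s = 1000 * eval_runs / eval_ms         # generation throughput

print(round(ms_per_prompt_token, 2))  # 793.19, matching the report
print(round(gen_tokens_per_s, 2))     # 3.58 tokens/s
```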
OK, now I don't see the speed problem anymore. I had to modify it to use all my cores, not just 4.
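For reference, the core count doesn't have to be hard-coded; it can be read from the machine (a sketch only; how the value is passed through depends on your pyllamacpp version, and `n_threads` is the llama.cpp-style name, not a confirmed wrapper parameter):

```python
import os

# Use all logical cores instead of a hard-coded default of 4;
# feed this to whatever thread-count setting your version exposes.
n_threads = os.cpu_count() or 4
print(n_threads)
```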
Yes, that was my second recommendation. But now it is really fast.
Congrats :slightly_smiling_face:
Thanks for your help. Does it take something like n² time as the context grows?
You can close this issue, it finally works!
Welcome.
Yes, well, not exactly n², but it takes time to digest the context, especially if it is long.
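As a toy illustration of why long contexts hurt: each new token attends to every token before it, so ingesting an n-token prompt costs roughly the sum 1 + 2 + ... + n (this ignores batching and KV-cache details, so treat it as a back-of-envelope model only):

```python
def rough_prompt_cost(n: int) -> float:
    # Token i attends to the i tokens before it, so total work ~ n*(n+1)/2
    return n * (n + 1) / 2

print(rough_prompt_cost(10))  # 55.0
print(rough_prompt_cost(20))  # 210.0 -> doubling the context roughly quadruples the work
```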
Hi @abdeladim-s, I wondered: when llama.cpp updates, do we just run pip install git+ ... again?
And do you know if using it from Python has overhead compared to using llama.cpp directly?
llama.cpp can now use prompt caching; is that supported? And if new parameters are added to llama.cpp, will they be automatically available?
Sorry for all the questions.
Hi @x4080,
Let me know if you have any other questions.
Thanks @abdeladim-s, my first question was about installation:
pip install git+https://github.com/abdeladim-s/pyllamacpp.git
So I'll wait for your implementation then. Maybe I'll ask again in Discussions (if I need some help).
Cheers
Yeah, if you want to install from source then use that command; otherwise you can pull the prebuilt binaries from PyPI:
pip install pyllamacpp
Sure, feel free to ask if you need any help :)
Everything works fine and the model can be loaded, but it takes a very long time (about 2 minutes) before it generates anything at a decent speed. Sometimes it stops after generating 5-10 tokens and only proceeds 20 seconds later.
Using 13B gpt4-x-alpaca, 12th-gen i7 12700F (n_threads = 20), f16_kv = 1, 16 GB RAM (the model fits).
Using alpaca.cpp it loads in about 2 seconds and generates right away without stopping.