SKSaki opened 2 weeks ago
Hello, I hope you are well, and thank you for reporting the problem.
The error you are encountering is common at this stage, so there is no need for concern.
Yuna is built on several newer techniques, so some of the software components it depends on may not yet be installed on your system.
Firstly, I recommend using the latest version of Yuna.
Secondly, make sure the Visual Studio Build Tools are fully installed; they provide the C/C++ toolchain that several of Yuna's dependencies need in order to compile.
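One way to install the Build Tools from a terminal is via winget; a sketch, assuming winget is available and that the package id and workload name below (taken from Microsoft's published ids) are still current:

```shell
# Install the VS 2022 Build Tools with the C++ workload (needed to compile
# native wheels such as llama-cpp-python). Run from an elevated prompt.
winget install --id Microsoft.VisualStudio.2022.BuildTools --override "--wait --add Microsoft.VisualStudio.Workload.VCTools --includeRecommended"
```

After installing, open a fresh terminal so the new toolchain is picked up before retrying the Yuna setup.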
Additionally, if your system does not have a capable GPU, you can run Yuna on the CPU instead; for this, please refer to the instructions in the README file.
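For the llama-cpp-python dependency specifically, a plain pip install builds the CPU-only variant by default (GPU backends are only enabled when you set `CMAKE_ARGS`). A minimal sketch, assuming you run it inside Yuna's virtual environment:

```shell
# CPU-only rebuild: with no CMAKE_ARGS set, llama.cpp compiles without
# CUDA/Metal backends. --no-cache-dir avoids reusing a broken cached build.
pip install --force-reinstall --no-cache-dir llama-cpp-python
```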
Hello, @SKSaki! Dev team here! To fix your issue, please try the following (fast mode). Documentation: https://llama-cpp-python.readthedocs.io/en/latest/
```
Installing NVIDIA dependencies...
Collecting llama-cpp-python
  Using cached llama_cpp_python-0.2.78.tar.gz (50.2 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: jinja2>=2.11.3 in c:\users\pcmr\appdata\local\programs\python\python310\lib\site-packages (from llama-cpp-python) (3.1.2)
Requirement already satisfied: numpy>=1.20.0 in c:\users\pcmr\appdata\local\programs\python\python310\lib\site-packages (from llama-cpp-python) (1.25.2)
Requirement already satisfied: typing-extensions>=4.5.0 in c:\users\pcmr\appdata\local\programs\python\python310\lib\site-packages (from llama-cpp-python) (4.8.0)
Collecting diskcache>=5.6.1
  Using cached diskcache-5.6.3-py3-none-any.whl (45 kB)
Requirement already satisfied: MarkupSafe>=2.0 in c:\users\pcmr\appdata\local\programs\python\python310\lib\site-packages (from jinja2>=2.11.3->llama-cpp-python) (2.1.3)
Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      scikit-build-core 0.9.6 using CMake 3.29.5 (wheel)
      Configuring CMake...
      2024-06-17 14:07:19,715 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
      loading initial cache file C:\Users\PCMR\AppData\Local\Temp\tmpis4l9h76\build\CMakeInit.txt
      -- Building for: NMake Makefiles
      CMake Error at CMakeLists.txt:3 (project):
        Running
  note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
```
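A CMake `project()` failure right after `-- Building for: NMake Makefiles` usually means CMake could not find a working C/C++ compiler, which is exactly what installing the Visual Studio Build Tools fixes. If you would rather skip the local build entirely, the llama-cpp-python docs also publish prebuilt wheels on an extra package index; a sketch, assuming that index URL (taken from the project's README) is still current for your Python version:

```shell
# Install a prebuilt CPU wheel instead of compiling from source,
# bypassing the CMake/NMake step that is failing here.
pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
```

If pip still falls back to the source tarball, check that a wheel exists for your exact Python version (3.10 here) on that index.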