26medias opened this issue 8 months ago (status: Open)
I apologize for the inconvenience, but what happens if you change langchain.callbacks.manager to langchain_core.callbacks.manager in the relevant line?
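The import change above can be made tolerant of either langchain layout with a guarded fallback, so the extension works whether or not the package has been split into langchain_core. This is a hedged sketch: resolve_attr and the candidate list are illustrative helpers, not part of the extension's actual code.

```python
import importlib

def resolve_attr(candidates):
    """Return the first importable attribute from a list of
    (module_name, attr_name) candidates. Mirrors the
    langchain -> langchain_core migration: try the new
    location first, fall back to the legacy one."""
    for mod_name, attr in candidates:
        try:
            return getattr(importlib.import_module(mod_name), attr)
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"none of {candidates} could be imported")

# Hypothetical usage for the import mentioned in this thread:
# CallbackManager = resolve_attr([
#     ("langchain_core.callbacks.manager", "CallbackManager"),
#     ("langchain.callbacks.manager", "CallbackManager"),
# ])
```

The try/except-ImportError pattern is the conventional way to handle a module that moved between package versions without pinning the dependency.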
Same error here:
Error running install.py for extension ...\stable-diffusion-webui\extensions\sd-webui-chatgpt.
Command: "...\stable-diffusion-webui\venv\Scripts\python.exe" "...\stable-diffusion-webui\extensions\sd-webui-chatgpt\install.py"
Error code: 1
stdout: Installing llama-cpp-python
stderr: Traceback (most recent call last):
File "...\stable-diffusion-webui\extensions\sd-webui-chatgpt\install.py", line 28, in
stderr: ERROR: HTTP error 404 while getting https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/wheels/llama_cpp_python-0.2.36+cu2.0.1-cp310-cp310-win_amd64.whl
ERROR: Could not install requirement llama-cpp-python==0.2.36+cu2.0.1 from https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/wheels/llama_cpp_python-0.2.36+cu2.0.1-cp310-cp310-win_amd64.whl because of HTTP error 404 Client Error: Not Found for url: https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/wheels/llama_cpp_python-0.2.36+cu2.0.1-cp310-cp310-win_amd64.whl for URL https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/wheels/llama_cpp_python-0.2.36+cu2.0.1-cp310-cp310-win_amd64.whl
P.S.: I replaced my actual directory path with "..."
I fixed the behavior when the CUDA version cannot be obtained or does not exist, in commit 1a7222ca7710c8e21956f3ddc2e18cd4d57f142f. Please check whether it works correctly on your side.
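The malformed +cu2.0.1 tag in the 404 URL suggests that something other than a CUDA version (it resembles a torch version) was substituted into the wheel filename. A fix along the lines described would validate the detected version before building the download URL, falling back rather than producing a guaranteed 404. This is a hypothetical sketch, not the actual commit; cuda_wheel_tag and the major.minor format assumption are mine.

```python
import re

def cuda_wheel_tag(version_string):
    """Turn a CUDA version like '12.1' into a wheel tag like 'cu121'.
    Return None when the version is missing or malformed (e.g. the
    three-part '2.0.1' seen in the log above), so the caller can
    fall back to a CPU wheel instead of requesting a nonexistent URL."""
    if not version_string:
        return None
    m = re.fullmatch(r"(\d+)\.(\d+)", version_string)
    if m is None:
        return None
    return f"cu{m.group(1)}{m.group(2)}"
```

With a check like this, an undetectable or oddly-shaped CUDA version yields None instead of a broken tag, and the installer can degrade gracefully.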
Installed from webui's "install" button.