NON906 / sd-webui-chatgpt

This repository is a Stable Diffusion web UI extension for conversations using the OpenAI API (compatible with ChatGPT) or llama.cpp.
MIT License

Install not working #11

Open · 26medias opened this issue 7 months ago

26medias commented 7 months ago
*** Error loading script: langchainapi.py
    Traceback (most recent call last):
      File "/home/julien/Projects/cloned/stable-diffusion-webui/modules/scripts.py", line 469, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "/home/julien/Projects/cloned/stable-diffusion-webui/modules/script_loading.py", line 10, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "/home/julien/Projects/cloned/stable-diffusion-webui/extensions/sd-webui-chatgpt/scripts/langchainapi.py", line 31, in <module>
        from langchain.callbacks.manager import AsyncCallbackManager
    ModuleNotFoundError: No module named 'langchain.callbacks.manager'

---
*** Error loading script: main.py
    Traceback (most recent call last):
      File "/home/julien/Projects/cloned/stable-diffusion-webui/modules/scripts.py", line 469, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "/home/julien/Projects/cloned/stable-diffusion-webui/modules/script_loading.py", line 10, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "/home/julien/Projects/cloned/stable-diffusion-webui/extensions/sd-webui-chatgpt/scripts/main.py", line 22, in <module>
        from scripts import chatgptapi, langchainapi
      File "/home/julien/Projects/cloned/stable-diffusion-webui/extensions/sd-webui-chatgpt/scripts/langchainapi.py", line 31, in <module>
        from langchain.callbacks.manager import AsyncCallbackManager
    ModuleNotFoundError: No module named 'langchain.callbacks.manager'

Installed from the web UI's "Install" button.

NON906 commented 7 months ago

I apologize for the inconvenience, but what happens if you change langchain.callbacks.manager to langchain_core.callbacks.manager on the relevant line (line 31 of scripts/langchainapi.py, per the traceback)?
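
For reference, this is the one-line change being suggested, as a sketch: it assumes the import at line 31 is the only place the legacy path is used, and that a recent langchain release (where AsyncCallbackManager lives in langchain_core.callbacks.manager) is installed.

    # scripts/langchainapi.py, line 31
    # Before -- fails on newer langchain releases where the legacy module path was removed:
    # from langchain.callbacks.manager import AsyncCallbackManager

    # After -- import from langchain_core, where the callback managers now live:
    from langchain_core.callbacks.manager import AsyncCallbackManager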

74573 commented 7 months ago

Same error here:

Error running install.py for extension ...\stable-diffusion-webui\extensions\sd-webui-chatgpt.
    Command: "...\stable-diffusion-webui\venv\Scripts\python.exe" "...\stable-diffusion-webui\extensions\sd-webui-chatgpt\install.py"
    Error code: 1
    stdout: Installing llama-cpp-python

    stderr: Traceback (most recent call last):
      File "...\stable-diffusion-webui\extensions\sd-webui-chatgpt\install.py", line 28, in <module>
        launch.run_pip('install https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/wheels/llama_cpp_python-0.2.36+cu' + cuda_version + '-cp310-cp310-win_amd64.whl', 'llama-cpp-python')
      File "...\stable-diffusion-webui\modules\launch_utils.py", line 143, in run_pip
        return run(f'"{python}" -m pip {command} --prefer-binary{index_url_line}', desc=f"Installing {desc}", errdesc=f"Couldn't install {desc}", live=live)
      File "...\stable-diffusion-webui\modules\launch_utils.py", line 115, in run
        raise RuntimeError("\n".join(error_bits))
    RuntimeError: Couldn't install llama-cpp-python.
    Command: "...\stable-diffusion-webui\venv\Scripts\python.exe" -m pip install https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/wheels/llama_cpp_python-0.2.36+cu2.0.1-cp310-cp310-win_amd64.whl --prefer-binary
    Error code: 1
    *** stdout: Collecting llama-cpp-python==0.2.36+cu2.0.1

    stderr: ERROR: HTTP error 404 while getting https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/wheels/llama_cpp_python-0.2.36+cu2.0.1-cp310-cp310-win_amd64.whl
    ERROR: Could not install requirement llama-cpp-python==0.2.36+cu2.0.1 from https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/wheels/llama_cpp_python-0.2.36+cu2.0.1-cp310-cp310-win_amd64.whl because of HTTP error 404 Client Error: Not Found for url: https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/wheels/llama_cpp_python-0.2.36+cu2.0.1-cp310-cp310-win_amd64.whl for URL https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/wheels/llama_cpp_python-0.2.36+cu2.0.1-cp310-cp310-win_amd64.whl


PS: I replaced my directory path with "...".

NON906 commented 7 months ago

I fixed the behavior for the case where the CUDA version cannot be obtained or a matching wheel does not exist, in 1a7222ca7710c8e21956f3ddc2e18cd4d57f142f. Please check whether it works correctly on your side.
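
For context: the failing URL above embeds cu2.0.1, which looks like a PyTorch version string rather than a CUDA tag such as cu121, so no such wheel is published and pip gets a 404. Below is a minimal sketch of the kind of guard such a fix needs (hypothetical code, not the contents of the actual commit), reusing the launch.run_pip helper that appears in the traceback; the KNOWN_TAGS list and get_cuda_tag helper are assumptions for illustration.

    # install.py sketch (hypothetical): fall back to a plain PyPI install when
    # the CUDA tag cannot be detected or no matching prebuilt wheel exists.
    import launch  # Stable Diffusion web UI launch utilities (see traceback above)

    WHEEL_URL = ('https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/'
                 'releases/download/wheels/'
                 'llama_cpp_python-0.2.36+cu{tag}-cp310-cp310-win_amd64.whl')

    # CUDA tags assumed to have published wheels (assumption, not verified).
    KNOWN_TAGS = ('117', '121', '122')

    def get_cuda_tag():
        """Return a wheel-style CUDA tag such as '121', or None if undetectable."""
        try:
            import torch
            if torch.version.cuda is None:  # CPU-only torch build
                return None
            return torch.version.cuda.replace('.', '')  # e.g. '12.1' -> '121'
        except Exception:
            return None

    tag = get_cuda_tag()
    if tag in KNOWN_TAGS:
        launch.run_pip('install ' + WHEEL_URL.format(tag=tag), 'llama-cpp-python')
    else:
        # No matching prebuilt cuBLAS wheel: install the generic package instead.
        launch.run_pip('install llama-cpp-python', 'llama-cpp-python')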