glucauze / sd-webui-faceswaplab

Extended faceswap extension for the StableDiffusion web-ui with multiple faceswaps, inpainting, checkpoints, and more.
https://glucauze.github.io/sd-webui-faceswaplab/
GNU Affero General Public License v3.0
707 stars 92 forks

Do not install both onnxruntime and onnxruntime-gpu #124

Open andypotato opened 9 months ago

andypotato commented 9 months ago

In requirements-gpu.txt, the requirement for onnxruntime should be removed; only onnxruntime-gpu should be kept.

Installing both onnxruntime and onnxruntime-gpu leads to undefined behavior, as the CUDAExecutionProvider might not be available despite the GPU runtime package being installed. If both are installed, onnxruntime.get_device() may return either "CPU" or "GPU" unpredictably.
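To see whether both distributions ended up in the same environment, you can query the installed package metadata (a stdlib-only sketch; the helper name is my own, not part of any project API):

```python
from importlib import metadata

def installed_onnxruntime_dists():
    """Return {distribution: version} for any onnxruntime packages found."""
    found = {}
    for dist in ("onnxruntime", "onnxruntime-gpu"):
        try:
            found[dist] = metadata.version(dist)
        except metadata.PackageNotFoundError:
            pass  # that distribution is not installed
    return found

if __name__ == "__main__":
    dists = installed_onnxruntime_dists()
    print(dists)
    if len(dists) > 1:
        print("WARNING: both packages installed -- provider selection is unreliable")
```

If the result contains both keys, that environment is in the conflicting state described above.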

Note that onnxruntime-gpu also contains the CPUExecutionProvider as a fallback, so nothing is lost by removing the CPU-only package.

glucauze commented 9 months ago

Thanks for your feedback, I will try to test and fix that when I have time.

KINGLIFER commented 9 months ago

Is there a fix for this yet?

andypotato commented 9 months ago

Workaround for now:

  • Activate venv
  • Uninstall onnxruntime
  • Remove the requirement from requirements-gpu.txt to prevent automatic reinstall on startup

Make sure you KEEP onnxruntime-gpu in requirements-gpu.txt and don't uninstall it.
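A quick way to verify what pip actually sees before and after the cleanup (assumes pip is on the PATH of the activated venv):

```shell
# List any onnxruntime distributions pip knows about; prints a
# fallback message if none (or no pip) is found
pip list 2>/dev/null | grep -i onnxruntime || echo "no onnxruntime packages found"
```

After the workaround, only onnxruntime-gpu should appear in this list.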

venshine commented 8 months ago

Workaround for now:

  • Activate venv
  • Uninstall onnxruntime
  • Remove the requirement from requirements-gpu.txt to prevent automatic reinstall on startup

Make sure you KEEP onnxruntime-gpu in requirements-gpu.txt and don't uninstall it.

I already uninstalled onnxruntime, but I get this error:

Error loading script: faceswaplab.py
    Traceback (most recent call last):
      File "/workspace/stable-diffusion-webui/modules/scripts.py", line 382, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "/workspace/stable-diffusion-webui/modules/script_loading.py", line 10, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "/workspace/stable-diffusion-webui/extensions/sd-webui-faceswaplab/scripts/faceswaplab.py", line 10, in <module>
        from scripts.faceswaplab_api import faceswaplab_api
      File "/workspace/stable-diffusion-webui/extensions/sd-webui-faceswaplab/scripts/faceswaplab_api/faceswaplab_api.py", line 12, in <module>
        from scripts.faceswaplab_swapping import swapper
      File "/workspace/stable-diffusion-webui/extensions/sd-webui-faceswaplab/scripts/faceswaplab_swapping/swapper.py", line 14, in <module>
        import insightface
      File "/root/miniconda3/envs/diffusion/lib/python3.10/site-packages/insightface/__init__.py", line 16, in <module>
        from . import model_zoo
      File "/root/miniconda3/envs/diffusion/lib/python3.10/site-packages/insightface/model_zoo/__init__.py", line 1, in <module>
        from .model_zoo import get_model
      File "/root/miniconda3/envs/diffusion/lib/python3.10/site-packages/insightface/model_zoo/model_zoo.py", line 22, in <module>
        class PickableInferenceSession(onnxruntime.InferenceSession):
    AttributeError: module 'onnxruntime' has no attribute 'InferenceSession'

andypotato commented 8 months ago

With your venv activated, start up python and type:

import onnxruntime as rt
rt.get_device()

This will print your backend, either "GPU" or "CPU". If you get an error like "no attribute", then your onnxruntime is not properly installed.
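A slightly more complete check (get_available_providers() is part of the public onnxruntime API; the import guard is only there so the snippet degrades gracefully when the package is missing):

```python
try:
    import onnxruntime as rt

    # Primary device this build targets: "GPU" or "CPU"
    print("device:", rt.get_device())

    # All execution providers compiled into this build; a working
    # onnxruntime-gpu install should include "CUDAExecutionProvider"
    print("providers:", rt.get_available_providers())
except ImportError:
    print("onnxruntime is not installed in this environment")
```

If "CUDAExecutionProvider" is missing from the providers list, the GPU build is not the one actually being imported.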

venshine commented 8 months ago

With your venv activated, start up python and type:

import onnxruntime as rt
rt.get_device()

This will print your backend, either "GPU" or "CPU". If you get an error like "no attribute", then your onnxruntime is not properly installed.

I already uninstalled onnxruntime and installed onnxruntime-gpu, but I still get the error.