Open andypotato opened 9 months ago
Thanks for your feedback, I will try to test and fix that when I have time.
fix?
Workaround for now:
- Activate the venv
- Uninstall onnxruntime
- Remove the onnxruntime requirement from requirements-gpu.txt to prevent an automatic reinstall on startup
Make sure you KEEP onnxruntime-gpu in requirements-gpu.txt and don't uninstall it.
I already uninstalled onnxruntime, but got this error:
Error loading script: faceswaplab.py
Traceback (most recent call last):
File "/workspace/stable-diffusion-webui/modules/scripts.py", line 382, in load_scripts
script_module = script_loading.load_module(scriptfile.path)
File "/workspace/stable-diffusion-webui/modules/script_loading.py", line 10, in load_module
module_spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/workspace/stable-diffusion-webui/extensions/sd-webui-faceswaplab/scripts/faceswaplab.py", line 10, in <module>
from scripts.faceswaplab_api import faceswaplab_api
File "/workspace/stable-diffusion-webui/extensions/sd-webui-faceswaplab/scripts/faceswaplab_api/faceswaplab_api.py", line 12, in <module>
from scripts.faceswaplab_swapping import swapper
File "/workspace/stable-diffusion-webui/extensions/sd-webui-faceswaplab/scripts/faceswaplab_swapping/swapper.py", line 14, in <module>
import insightface
File "/root/miniconda3/envs/diffusion/lib/python3.10/site-packages/insightface/__init__.py", line 16, in <module>
from . import model_zoo
File "/root/miniconda3/envs/diffusion/lib/python3.10/site-packages/insightface/model_zoo/__init__.py", line 1, in <module>
from .model_zoo import get_model
File "/root/miniconda3/envs/diffusion/lib/python3.10/site-packages/insightface/model_zoo/model_zoo.py", line 22, in <module>
class PickableInferenceSession(onnxruntime.InferenceSession):
AttributeError: module 'onnxruntime' has no attribute 'InferenceSession'
With your venv activated, start up python and type:
import onnxruntime as rt
rt.get_device()
This will print out your backend, either GPU or CPU. In case you get an error like "no attribute", then your onnxruntime is not properly installed.
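If the install is missing or half-broken, the import itself can raise, so a guarded version of that check is handy. This is only a sketch; the function name is made up for illustration:

```python
import importlib.util

def diagnose_onnxruntime():
    """Report whether onnxruntime is importable and exposes the expected API."""
    if importlib.util.find_spec("onnxruntime") is None:
        return "onnxruntime is not installed"
    import onnxruntime as rt
    if not hasattr(rt, "InferenceSession"):
        # The "no attribute" state from the traceback: the package is
        # present but broken, e.g. after a clash between onnxruntime
        # and onnxruntime-gpu installed side by side.
        return "onnxruntime is installed but broken"
    # rt.get_device() returns "GPU" or "CPU" depending on the build
    return "onnxruntime OK, device: " + rt.get_device()

print(diagnose_onnxruntime())
```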
I already uninstalled onnxruntime and installed onnxruntime-gpu, but I still get the error.
In requirements-gpu.txt the requirement for onnxruntime should be removed. Only keep onnxruntime-gpu.
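With that change, requirements-gpu.txt would list only the GPU package for the ONNX runtime (a sketch; any other lines in the file stay untouched):

```text
onnxruntime-gpu
```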
Installing both onnxruntime and onnxruntime-gpu leads to undefined behavior, as the CUDAExecutionProvider might not be available despite the GPU runtime package being installed. If both are installed, onnxruntime.get_device() may just randomly return "CPU" or "GPU". Actually, onnxruntime-gpu also contains the CPUExecutionProvider as a fallback.
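A quick way to spot the conflicting dual install described above, without even importing the package, is to list the installed distributions. A minimal sketch using only the standard library, with the package names from this thread:

```python
from importlib import metadata

def installed_onnxruntime_packages():
    """Map each installed onnxruntime distribution to its version."""
    found = {}
    for dist in ("onnxruntime", "onnxruntime-gpu"):
        try:
            found[dist] = metadata.version(dist)
        except metadata.PackageNotFoundError:
            pass  # not installed, which is fine
    return found

packages = installed_onnxruntime_packages()
if len(packages) > 1:
    # Both packages present: the undefined-behavior case described above
    print("Conflicting install, uninstall one of:", packages)
```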