Mikubill / sd-webui-controlnet

WebUI extension for ControlNet

InstantID problem with webUI : v1.8.0 #2687

Closed · dhonta closed this 7 months ago

dhonta commented 8 months ago

Hi, could you please help me with this problem? When I try to use InstantID I get the following error:

2024-03-07 19:20:31,962 - ControlNet - INFO - preprocessor resolution = 512
2024-03-07 19:20:32.0351307 [E:onnxruntime:Default, provider_bridge_ort.cc:1548 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1209 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

EP Error
EP Error D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:857 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported. when using ['CUDAExecutionProvider', 'CPUExecutionProvider']
Falling back to ['CUDAExecutionProvider', 'CPUExecutionProvider'] and retrying.


2024-03-07 19:20:32.1120865 [E:onnxruntime:Default, provider_bridge_ort.cc:1548 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1209 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

*** Error running process: E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py
Traceback (most recent call last):
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:857 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\modules\scripts.py", line 784, in process
    script.process(p, *script_args)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1279, in process
    self.controlnet_hack(p)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1264, in controlnet_hack
    self.controlnet_main_entry(p)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1029, in controlnet_main_entry
    controls, hr_controls = list(zip(*[preprocess_input_image(img) for img in optional_tqdm(input_images)]))
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1029, in <listcomp>
    controls, hr_controls = list(zip(*[preprocess_input_image(img) for img in optional_tqdm(input_images)]))
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 986, in preprocess_input_image
    detected_map, is_image = self.preprocessor[unit.module](
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\utils.py", line 81, in decorated_func
    return cached_func(*args, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\utils.py", line 65, in cached_func
    return func(*args, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\global_state.py", line 37, in unified_preprocessor
    return preprocessor_modules[preprocessor_name](*args, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\processor.py", line 801, in run_model_instant_id
    self.load_model()
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\extensions\sd-webui-controlnet\scripts\processor.py", line 741, in load_model
    self.model = FaceAnalysis(
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\insightface\app\face_analysis.py", line 31, in __init__
    model = model_zoo.get_model(onnx_file, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 96, in get_model
    model = router.get_model(providers=providers, provider_options=provider_options)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 40, in get_model
    session = PickableInferenceSession(self.onnx_file, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in __init__
    super().__init__(model_path, **kwargs)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 432, in __init__
    raise fallback_error from e
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 427, in __init__
    self._create_inference_session(self._fallback_providers, None)
  File "E:\Programmes\AI\Stable_diffusion_10_8\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:857 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
huchenlei commented 8 months ago

This seems like a problem with your onnxruntime setup. Did this issue start happening after upgrading to A1111 v1.8.0?
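
If it helps narrow it down, here is a quick check you can run with the webui venv's Python to see whether this onnxruntime build can load its CUDA provider at all (a rough sketch; the model path is only a placeholder):

```python
import onnxruntime as ort

# Which onnxruntime build is installed and which execution providers it can load.
# For GPU inference, 'CUDAExecutionProvider' should appear in this list.
print(ort.__version__)
print(ort.get_available_providers())

# Requesting the CUDA provider explicitly reproduces the failure outside the webui.
# LoadLibrary error 126 usually means the CUDA/cuDNN DLLs this wheel was built
# against are missing from PATH or are the wrong major version.
sess = ort.InferenceSession(
    "any_model.onnx",  # placeholder path, not from the original report
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
print(sess.get_providers())  # shows which provider was actually selected
```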

dhonta commented 8 months ago

> This seems like a problem with your onnxruntime setup. Did this issue start happening after upgrading to A1111 v1.8.0?

Thanks for your help. Yes, I have no problem with version 1.7.0; the error only appears when I use v1.8.0.
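
One possible explanation (an assumption, not confirmed here): A1111 v1.8.0 updated the bundled torch/CUDA runtime, and if the installed onnxruntime-gpu wheel was built against a different CUDA major version, its CUDA provider DLL fails to load exactly like this. A small comparison that could be run in both the v1.7.0 and v1.8.0 venvs:

```python
import torch
import onnxruntime as ort

# CUDA runtime version bundled with the torch wheel in this venv (e.g. 11.8 vs 12.1).
print("torch", torch.__version__, "CUDA", torch.version.cuda)

# onnxruntime build in the same venv. If its CUDA provider targets a different
# CUDA/cuDNN major version than torch ships, LoadLibrary error 126 is a typical symptom.
print("onnxruntime", ort.__version__)
print(ort.get_available_providers())
```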

mnbv7758 commented 7 months ago

I also have the same problem.

huchenlei commented 7 months ago

The onnxruntime issue should be solved by https://github.com/Mikubill/sd-webui-controlnet/pull/2761.
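
Until that lands, a common interim workaround (just a sketch of the general idea, not the actual change in that PR) is to build the InsightFace FaceAnalysis session with the CPU provider only, so the broken CUDA provider is never requested:

```python
from insightface.app import FaceAnalysis

# "antelopev2" is the model pack the InstantID preprocessor relies on (assumption
# based on the traceback above). Running it on CPU is slower than GPU but sidesteps
# the failing onnxruntime CUDA provider.
app = FaceAnalysis(name="antelopev2", providers=["CPUExecutionProvider"])
app.prepare(ctx_id=-1, det_size=(640, 640))  # ctx_id=-1 keeps everything on CPU
```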