Mikubill / sd-webui-controlnet

WebUI extension for ControlNet
GNU General Public License v3.0

Error when trying to use any ip-adapter-faceid models. #2962

Closed. nternan closed this issue 23 hours ago.

nternan commented 1 week ago

Hello, I keep getting an error when trying to use any ip-adapter-faceid models. Other ip-adapter models work, but not those. My GPU drivers are up to date, and so is my WebUI. I deleted and reinstalled the venv, but the error still occurs. I copied the error into the attached doc. Any advice? Thanks. message.txt

INTstinkt commented 1 day ago

I think I have the exact same issue, but only on my second drive (F:); on C: it still works. I searched for a way to point the path at a C: folder but didn't find anything.

2024-06-30 00:36:38,485 - ControlNet - INFO - unit_separate = False, style_align = False
2024-06-30 00:36:38,727 - ControlNet - INFO - Loading model: ip-adapter-faceid-plusv2_sdxl [187cb962]
2024-06-30 00:36:38,993 - ControlNet - INFO - Loaded state_dict from [F:\stable-diffusion-webui\models\ControlNet\ip-adapter-faceid-plusv2_sdxl.bin]
2024-06-30 00:36:41,270 - ControlNet - INFO - ControlNet model ip-adapter-faceid-plusv2_sdxl [187cb962](ControlModelType.IPAdapter) loaded.
2024-06-30 00:36:41,272 - ControlNet - INFO - Using preprocessor: ip-adapter-auto
2024-06-30 00:36:41,272 - ControlNet - INFO - preprocessor resolution = 512
2024-06-30 00:36:41,272 - ControlNet - INFO - ip-adapter-auto => ip-adapter_face_id_plus
2024-06-30 00:36:41.5786723 [E:onnxruntime:Default, provider_bridge_ort.cc:1744 onnxruntime::TryGetProviderInfo_CUDA] C:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1426 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "F:\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

*************** EP Error ***************
EP Error C:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:866 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page  (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements),  make sure they're in the PATH, and that your GPU is supported.
 when using ['CUDAExecutionProvider', 'CPUExecutionProvider']
Falling back to ['CUDAExecutionProvider', 'CPUExecutionProvider'] and retrying.
****************************************
2024-06-30 00:36:41.6417906 [E:onnxruntime:Default, provider_bridge_ort.cc:1744 onnxruntime::TryGetProviderInfo_CUDA] C:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1426 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "F:\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

*** Error running process: F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py
    Traceback (most recent call last):
      File "F:\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
        self._create_inference_session(providers, provider_options, disabled_optimizers)
      File "F:\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
        sess.initialize_session(providers, provider_options, disabled_optimizers)
    RuntimeError: C:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:866 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page  (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements),  make sure they're in the PATH, and that your GPU is supported.

    The above exception was the direct cause of the following exception:

    Traceback (most recent call last):
      File "F:\stable-diffusion-webui\modules\scripts.py", line 825, in process
        script.process(p, *script_args)
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1222, in process
        self.controlnet_hack(p)
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1207, in controlnet_hack
        self.controlnet_main_entry(p)
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 941, in controlnet_main_entry
        controls, hr_controls, additional_maps = get_control(
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 290, in get_control
        controls, hr_controls = list(zip(*[preprocess_input_image(img) for img in optional_tqdm(input_images)]))
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 290, in <listcomp>
        controls, hr_controls = list(zip(*[preprocess_input_image(img) for img in optional_tqdm(input_images)]))
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 242, in preprocess_input_image
        result = preprocessor.cached_call(
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\supported_preprocessor.py", line 196, in cached_call
        result = self._cached_call(input_image, *args, **kwargs)
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\utils.py", line 82, in decorated_func
        return cached_func(*args, **kwargs)
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\utils.py", line 66, in cached_func
        return func(*args, **kwargs)
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\supported_preprocessor.py", line 209, in _cached_call
        return self(*args, **kwargs)
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\preprocessor\ip_adapter_auto.py", line 25, in __call__
        return p(*args, **kwargs)
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\preprocessor\legacy\legacy_preprocessors.py", line 105, in __call__
        result, is_image = self.call_function(
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\preprocessor\legacy\processor.py", line 749, in face_id_plus
        face_embed, _ = g_insight_face_model.run_model(img)
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\preprocessor\legacy\processor.py", line 677, in run_model
        self.load_model()
      File "F:\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\preprocessor\legacy\processor.py", line 669, in load_model
        self.model = FaceAnalysis(
      File "F:\stable-diffusion-webui\venv\lib\site-packages\insightface\app\face_analysis.py", line 31, in __init__
        model = model_zoo.get_model(onnx_file, **kwargs)
      File "F:\stable-diffusion-webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 96, in get_model
        model = router.get_model(providers=providers, provider_options=provider_options)
      File "F:\stable-diffusion-webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 40, in get_model
        session = PickableInferenceSession(self.onnx_file, **kwargs)
      File "F:\stable-diffusion-webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in __init__
        super().__init__(model_path, **kwargs)
      File "F:\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 432, in __init__
        raise fallback_error from e
      File "F:\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 427, in __init__
        self._create_inference_session(self._fallback_providers, None)
      File "F:\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
        sess.initialize_session(providers, provider_options, disabled_optimizers)
    RuntimeError: C:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:866 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page  (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements),  make sure they're in the PATH, and that your GPU is supported.
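A quick way to confirm what the traceback above is reporting is to ask onnxruntime which execution providers this build can actually load. This is only a diagnostic sketch; run it inside the same venv the WebUI uses. If `CUDAExecutionProvider` is missing from the list, insightface will fail exactly as in the log.

```python
# Check which onnxruntime execution providers are loadable in this venv.
try:
    import onnxruntime as ort
    available = ort.get_available_providers()
except ImportError:
    available = None  # onnxruntime is missing from this environment

print("available providers:", available)

# "LoadLibrary failed with error 126" on Windows means a DLL dependency of
# onnxruntime_providers_cuda.dll (CUDA/cuDNN runtime libraries) was not found.
cuda_ok = available is not None and "CUDAExecutionProvider" in available
print("CUDA provider loadable:", cuda_ok)
```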

Edit: Never mind, I don't think it's the same problem. The fix for me was:

1. Go to the A1111 folder, open cmd, and type venv\scripts\activate
2. When you see (venv) X:\path-to-webui\stable-diffusion-webui>, run:
   pip install insightface
   pip install pydantic
   pip install albumentations
   pip install onnxruntime
   pip install onnxruntime-gpu
3. Then deactivate the venv with venv\scripts\deactivate

For whatever reason I had to install onnxruntime twice, and then it worked.
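The steps above amount to making sure those packages are visible inside the WebUI's venv. A minimal check (the helper name `check_pkg` is my own, not from the extension) can be run inside the activated venv to see which of them are actually importable before and after the pip installs:

```python
import importlib.util

def check_pkg(name):
    """Return True if the package can be found in the current environment."""
    return importlib.util.find_spec(name) is not None

# The packages the fix above installs.
for pkg in ["insightface", "pydantic", "albumentations", "onnxruntime"]:
    print(f"{pkg}: {'installed' if check_pkg(pkg) else 'MISSING'}")
```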

nternan commented 23 hours ago

It appears mine was a CUDA conflict, so I removed CUDA and reinstalled the venv.
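Since the error message says "CUDA_PATH is set but CUDA wasnt able to be loaded", a stdlib-only check of what the process environment actually contains can help confirm a conflict like this before removing or reinstalling anything:

```python
import os

# The RuntimeError fires when CUDA_PATH points at a CUDA install whose DLLs
# don't match what this onnxruntime-gpu build expects. Print what the
# current process sees.
cuda_path = os.environ.get("CUDA_PATH", "<not set>")
print("CUDA_PATH =", cuda_path)

# Multiple CUDA entries on PATH from different toolkit versions are a common
# source of this kind of conflict.
path_entries = os.environ.get("PATH", "").split(os.pathsep)
cuda_entries = [p for p in path_entries if "cuda" in p.lower()]
print("CUDA-related PATH entries:", cuda_entries)
```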