100%|████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:11<00:00, 11.14s/it]
23:24:02 - ReActor - STATUS - Working: source face index [0], target face index [0] | 0/1 [00:00<?, ?it/s]
23:24:02 - ReActor - STATUS - Using Loaded Source Face Model: julian-blend.safetensors
23:24:02 - ReActor - STATUS - Analyzing Target Image...
2024-03-05 23:24:02.2239947 [E:onnxruntime:Default, provider_bridge_ort.cc:1548 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1209 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\AItest\new\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
*************** EP Error ***************
EP Error D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:857 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
when using ['CUDAExecutionProvider']
Falling back to ['CUDAExecutionProvider', 'CPUExecutionProvider'] and retrying.
****************************************
2024-03-05 23:24:02.3368121 [E:onnxruntime:Default, provider_bridge_ort.cc:1548 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1209 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\AItest\new\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
*** Error running postprocess_image: C:\AItest\new\webui\extensions\sd-webui-reactor\scripts\reactor_faceswap.py
Traceback (most recent call last):
File "C:\AItest\new\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\AItest\new\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:857 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\AItest\new\webui\modules\scripts.py", line 856, in postprocess_image
script.postprocess_image(p, pp, *script_args)
File "C:\AItest\new\webui\extensions\sd-webui-reactor\scripts\reactor_faceswap.py", line 450, in postprocess_image
result, output, swapped = swap_face(
File "C:\AItest\new\webui\extensions\sd-webui-reactor\scripts\reactor_swapper.py", line 594, in swap_face
target_faces = analyze_faces(target_img, det_thresh=detection_options.det_thresh, det_maxnum=detection_options.det_maxnum)
File "C:\AItest\new\webui\extensions\sd-webui-reactor\scripts\reactor_swapper.py", line 302, in analyze_faces
face_analyser = copy.deepcopy(getAnalysisModel())
File "C:\AItest\new\webui\extensions\sd-webui-reactor\scripts\reactor_swapper.py", line 145, in getAnalysisModel
ANALYSIS_MODEL = insightface.app.FaceAnalysis(
File "C:\AItest\new\webui\extensions\sd-webui-reactor\scripts\console_log_patch.py", line 48, in patched_faceanalysis_init
model = model_zoo.get_model(onnx_file, **kwargs)
File "C:\AItest\new\webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 96, in get_model
model = router.get_model(providers=providers, provider_options=provider_options)
File "C:\AItest\new\webui\extensions\sd-webui-reactor\scripts\console_log_patch.py", line 21, in patched_get_model
session = PickableInferenceSession(self.onnx_file, **kwargs)
File "C:\AItest\new\webui\venv\lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in __init__
super().__init__(model_path, **kwargs)
File "C:\AItest\new\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 432, in __init__
raise fallback_error from e
File "C:\AItest\new\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 427, in __init__
self._create_inference_session(self._fallback_providers, None)
File "C:\AItest\new\webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 483, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:857 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
---
Total progress: 100%|████████████████████████████████████████████████████████████████████| 1/1 [00:39<00:00, 39.24s/it]
What happened?
After updating A1111 WebUI to 1.8, ReActor no longer works: face swapping fails with the ONNX Runtime CUDA error shown in the console log above.
Steps to reproduce the problem
1. Update A1111 WebUI to 1.8.
2. Generate an image with the ReActor extension enabled.
3. The face swap step fails with `LoadLibrary failed with error 126` for `onnxruntime_providers_cuda.dll`, and the CPU fallback also raises a RuntimeError.
Sysinfo
sysinfo-2024-03-05-15-27.json
Relevant console log
See the console output at the top of this report.
Additional information
ReActor worked correctly before the update, on A1111 1.7.
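To help narrow this down, here is a minimal diagnostic sketch (run inside the WebUI venv). It assumes `onnxruntime` or `onnxruntime-gpu` is installed there; the function name `check_ort_cuda` is just illustrative. Error 126 usually means the CUDA provider DLL or one of its dependencies (CUDA/cuDNN runtime DLLs) could not be found on `PATH`, so checking which providers ONNX Runtime actually exposes is a quick first step.

```python
def check_ort_cuda():
    """Return the list of execution providers ONNX Runtime reports as
    available, or None if onnxruntime is not importable in this venv."""
    try:
        import onnxruntime as ort
    except ImportError:
        return None
    # get_available_providers() lists providers the installed build can use,
    # e.g. ['CUDAExecutionProvider', 'CPUExecutionProvider'] for a working GPU setup.
    return ort.get_available_providers()

if __name__ == "__main__":
    providers = check_ort_cuda()
    if providers is None:
        print("onnxruntime is not installed in this environment")
    elif "CUDAExecutionProvider" in providers:
        print("CUDA provider is available:", providers)
    else:
        # If only the CPU provider shows up, the onnxruntime-gpu build or its
        # CUDA/cuDNN dependencies likely do not match; reinstalling a version
        # matching the installed CUDA toolkit (see the onnxruntime CUDA EP
        # requirements page linked in the error) is the usual fix.
        print("CPU only:", providers)
```

If `CUDAExecutionProvider` is missing from the output, the installed `onnxruntime-gpu` wheel and the local CUDA/cuDNN versions probably do not match the combination listed on the requirements page in the error message.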