Error loading script: app.py
Traceback (most recent call last):
File "C:\Users\coron\OneDrive\デスクトップ\stable diffusion\stable-diffusion-webui\modules\scripts.py", line 205, in load_scripts
module = script_loading.load_module(scriptfile.path)
File "C:\Users\coron\OneDrive\デスクトップ\stable diffusion\stable-diffusion-webui\modules\script_loading.py", line 13, in load_module
exec(compiled, module.__dict__)
File "C:\Users\coron\OneDrive\デスクトップ\stable diffusion\stable-diffusion-webui\extensions\ABG_extension\scripts\app.py", line 18, in <module>
rmbg_model = rt.InferenceSession(model_path, providers=providers)
File "C:\Users\coron\OneDrive\デスクトップ\stable diffusion\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 347, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\Users\coron\OneDrive\デスクトップ\stable diffusion\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 395, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:574 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/reference/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
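The traceback shows the ABG extension failing at `rt.InferenceSession(model_path, providers=providers)` because onnxruntime was asked for the `CUDAExecutionProvider` but could not load CUDA/cuDNN. Besides installing the matching CUDA and cuDNN versions from the linked requirements page, one workaround is to only request providers that onnxruntime actually reports as available, falling back to CPU inference. The sketch below is a hypothetical helper (`pick_providers` is not part of the extension or of onnxruntime), assuming the real `onnxruntime.get_available_providers()` API:

```python
def pick_providers(available):
    """Return execution providers in preference order, keeping only
    those that onnxruntime reports as available on this machine.

    This trades GPU speed for robustness: when CUDA cannot be loaded,
    the session is created with the CPU provider instead of raising
    the RuntimeError shown above.
    """
    preferred = ["CUDAExecutionProvider", "CPUExecutionProvider"]
    chosen = [p for p in preferred if p in available]
    # CPUExecutionProvider ships with every onnxruntime build, so this
    # final fallback should never be needed in practice.
    return chosen or ["CPUExecutionProvider"]
```

In the extension's `app.py`, the failing line could then (hypothetically) become `rmbg_model = rt.InferenceSession(model_path, providers=pick_providers(rt.get_available_providers()))`, so the script loads even on machines where `CUDA_PATH` is set but the CUDA libraries are broken or mismatched.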