Open jigeng123 opened 1 year ago
Loading wd14-vit-v2-git model file from SmilingWolf/wd-v1-4-vit-tagger-v2
Downloading model.onnx: 100%|███████████████████████████████████████████████████████| 373M/373M [00:33<00:00, 11.2MB/s]
Error completing request
Arguments: (<PIL.Image.Image image mode=RGB size=512x768 at 0x2491A415C60>, '', False, '', '[name].[output_extension]', 'ignore', False, False, 'wd14-vit-v2-git', 0.35, '', '', False, False, True, '00, (o)(o), ++, +-, ..,
Traceback (most recent call last):
File "C:\SD\stable-diffusion-webui\venv\lib\site-packages\gradio\routes.py", line 337, in run_predict
output = await app.get_blocks().process_api(
File "C:\SD\stable-diffusion-webui\venv\lib\site-packages\gradio\blocks.py", line 1018, in process_api
data = self.postprocess_data(fn_index, result["prediction"], state)
File "C:\SD\stable-diffusion-webui\venv\lib\site-packages\gradio\blocks.py", line 935, in postprocess_data
if predictions[i] is components._Keywords.FINISHED_ITERATING:
IndexError: tuple index out of range
Delete the folder extensions/wd-v1-4-vit-tagger-v2, then clone this repository under extensions/ instead:
$ git clone https://github.com/toriato/stable-diffusion-webui-wd14-tagger.git extensions/tagger
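If you prefer to script the same two steps, here is a minimal Python sketch (assumes it is run from the stable-diffusion-webui root and that git is on PATH):

# Reinstall the tagger extension: remove the old folder, clone the current repo.
import shutil
import subprocess
from pathlib import Path

extensions = Path("extensions")
old_dir = extensions / "wd-v1-4-vit-tagger-v2"
if old_dir.exists():
    shutil.rmtree(old_dir)  # drop the broken install

subprocess.run(
    ["git", "clone",
     "https://github.com/toriato/stable-diffusion-webui-wd14-tagger.git",
     str(extensions / "tagger")],
    check=True,
)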
I have the same problem on my M1pro MacBook
Has anybody fixed it yet? Neither letting it download by itself nor downloading it manually works at all; I still get the 'tuple index out of range' error.
I have the same problem on my M1pro MacBook
In the file tagger/interrogator.py, change "onnxruntime-gpu" to "onnxruntime". It works for me.
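To confirm which onnxruntime build the webui venv is actually using before or after that edit, a quick sketch (run with the venv's Python; the CPU-only package on an M1/M2 Mac typically reports only CPUExecutionProvider):

# Check the installed onnxruntime build and its available execution providers.
import onnxruntime as ort

print("onnxruntime version:", ort.__version__)
print("available providers:", ort.get_available_providers())

# The tagger creates its session roughly like this; with the plain
# "onnxruntime" package only the CPU provider is available:
# session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])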
My M1pro MacBook has the same problem too.
I have the same problem on my M1pro MacBook
Has it been solved? If not, you can try deleting the venv folder and running ./webui.sh in the terminal to re-download everything, but make sure there is no python3.10 in Homebrew.
I have the same problem on my M1pro MacBook
In the file tagger/interrogator.py, change "onnxruntime-gpu" to "onnxruntime". It works for me.
It solved the issue, thank you so much.
I have the same problem on my M1pro MacBook
In the file tagger/interrogator.py, change "onnxruntime-gpu" to "onnxruntime". It works for me.
It works for me, thank you.
On the latest version, I tried reinstalling the plugin and even removed the whole folder from <.cache/huggingface>, but it doesn't work.
Loading wd14-vit-v2-git model file from SmilingWolf/wd-v1-4-vit-tagger-v2
Downloading (…)"model.onnx";: 1%|▎ | 2.03M/373M [00:01<04:24, 1.40MB/s]
Downloading (…)in/selected_tags.csv: 100%|███████████████████████████████████████████| 254k/254k [00:00<00:00, 412kB/s]
Error completing request
Arguments: (<PIL.Image.Image image mode=RGB size=309x334 at 0x20B5F3858D0>, '', False, '', '[name].[output_extension]', 'ignore', False, False, 'wd14-vit-v2-git', 0.35, '', '', False, False, True, '00, (o)(o), ++, +-, ..,, <|><|>, ==, >_<, 3_3, 6_9, >o, @@, ^_^, o_o, u_u, xx, ||, ||_||', False, False) {}
Traceback (most recent call last):
File "C:\SSD\stable-diffusion-webui\modules\call_queue.py", line 56, in f
res = list(func(*args, **kwargs))
File "C:\SSD\stable-diffusion-webui\modules\call_queue.py", line 37, in f
res = func(*args, **kwargs)
File "C:\SSD\stable-diffusion-webui\extensions\tagger\tagger\ui.py", line 69, in on_interrogate
ratings, tags = interrogator.interrogate(image)
File "C:\SSD\stable-diffusion-webui\extensions\tagger\tagger\interrogator.py", line 275, in interrogate
self.load()
File "C:\SSD\stable-diffusion-webui\extensions\tagger\tagger\interrogator.py", line 260, in load
self.model = InferenceSession(str(model_path), providers=providers)
File "C:\SSD\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 347, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\SSD\stable-diffusion-webui\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 384, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from C:\Users\jigeng\.cache\huggingface\hub\models--SmilingWolf--wd-v1-4-vit-tagger-v2\snapshots\fab4cf71b7d2988e18ed3f40c17c1559e361c8f6\model.onnx failed:Protobuf parsing failed.
Traceback (most recent call last):
File "C:\SSD\stable-diffusion-webui\venv\lib\site-packages\gradio\routes.py", line 337, in run_predict
output = await app.get_blocks().process_api(
File "C:\SSD\stable-diffusion-webui\venv\lib\site-packages\gradio\blocks.py", line 1018, in process_api
data = self.postprocess_data(fn_index, result["prediction"], state)
File "C:\SSD\stable-diffusion-webui\venv\lib\site-packages\gradio\blocks.py", line 935, in postprocess_data
if predictions[i] is components._Keywords.FINISHED_ITERATING:
IndexError: tuple index out of range
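The INVALID_PROTOBUF error usually means the model.onnx on disk is incomplete or corrupted; the log above shows the download stopping at 1%. One way to force a clean re-download, assuming the huggingface_hub package that the webui already installs, is a sketch like this (run with the venv's Python):

# Force a fresh download of the tagger model into the Hugging Face cache,
# replacing whatever partial or corrupted model.onnx is there.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="SmilingWolf/wd-v1-4-vit-tagger-v2",
    filename="model.onnx",
    force_download=True,  # ignore the cached (possibly truncated) copy
)
print("re-downloaded to:", path)

Deleting the models--SmilingWolf--wd-v1-4-vit-tagger-v2 folder from the Hugging Face cache and letting the extension download it again should have the same effect, provided the download actually completes this time.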