Probably need to update transformers
I have the same error, but strangely it only occurs intermittently. It works for several runs, then after I use other workflows and come back to Florence2, it suddenly stops working. It's fickle somehow. My transformers is updated to 4.42.3, but I still sometimes get:
Error occurred when executing DownloadAndLoadFlorence2Model:
Failed to import transformers.models.cohere.configuration_cohere because of the following error (look up to see its traceback): No module named 'transformers.models.cohere.configuration_cohere'
Well, it pretty much has to be a transformers version mismatch; Cohere was added 4 months ago: https://github.com/huggingface/transformers/commits/main/src/transformers/models/cohere/configuration_cohere.py
Maybe I should up the minimum version to 4.39.0 as that's when it was added.
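If you want to confirm what's actually loaded, here's a quick sanity check you can run in the same Python environment as ComfyUI (just a diagnostic sketch, not part of the node):

```python
# Check which transformers is actually loaded and whether the cohere config
# module (added in transformers 4.39.0) is importable from it.
import importlib

import transformers

print("transformers version:", transformers.__version__)
print("loaded from:", transformers.__file__)

try:
    importlib.import_module("transformers.models.cohere.configuration_cohere")
    print("cohere config imports fine")
except ModuleNotFoundError as exc:
    print("cohere config missing:", exc)
```

If the reported version is >= 4.39.0 but the import still fails, the files on disk probably don't match the installed metadata (a partial upgrade, or two environments getting mixed).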
But it's strange: sometimes it works, sometimes it throws the error, without me restarting the server. And I already have a higher version of transformers than 4.39.0.
Maybe another node or process causes it to throw the error? I notice that when it does work, it stops working when I come back to the workflow after using other workflows.
By the way, I am using it without flash attention installed; I use the sdpa mode.
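For what it's worth, the call the node makes (visible in the tracebacks below) can be reproduced outside ComfyUI. A minimal standalone sketch with sdpa looks roughly like this; the model id, dtype, and device are just example choices:

```python
# Rough standalone reproduction of the node's from_pretrained call, using
# sdpa instead of flash attention. Model id, dtype and device are assumptions;
# device_map needs the accelerate package installed.
import torch
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "microsoft/Florence-2-base"
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    attn_implementation="sdpa",
    torch_dtype=torch.float16,
    device_map="cuda",
    trust_remote_code=True,
)
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
print(type(model).__name__)
```

If this fails the same way in a plain Python session, the problem sits in the transformers install itself rather than in ComfyUI or the node.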
I now have this issue too; however, mine occurred as a result of upgrading from transformers-4.38.2 to transformers-4.42.3. I did this to try and resolve a different issue I was having with this node, but now I get the same error as everyone above.
Error occurred when executing DownloadAndLoadFlorence2Model:
Failed to import transformers.models.cohere.configuration_cohere because of the following error (look up to see its traceback):
No module named 'transformers.models.cohere.configuration_cohere'
File "D:\aitools\ComfyUI\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "D:\aitools\ComfyUI\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "D:\aitools\ComfyUI\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "D:\aitools\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Florence2\nodes.py", line 85, in loadmodel
model = AutoModelForCausalLM.from_pretrained(model_path, attn_implementation=attention, device_map=device, torch_dtype=dtype,trust_remote_code=True)
File "C:\Users\matde\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\auto_factory.py", line 541, in from_pretrained
)
File "C:\Users\matde\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\auto_factory.py", line 752, in keys
if key in self._model_mapping.keys()
File "C:\Users\matde\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\auto_factory.py", line 753, in
]
File "C:\Users\matde\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\auto_factory.py", line 749, in _load_attr_from_module
mapping_keys = [
File "C:\Users\matde\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\auto_factory.py", line 693, in getattribute_from_module
# object at the top level.
File "C:\Users\matde\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1550, in __getattr__
File "C:\Users\matde\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1562, in _get_module
Just to add, I have now downgraded back to Transformers 4.38.2 and this error has now gone.
I can tell you with 100% certainty that this runs with 4.42.3 just fine; there's something else going on. To be absolutely sure, I tested with portable ComfyUI:
The problem remains
I found that mixlab nodes was at least for me interfering with florence2. Uninstalling Mixlabs fixed my issues.
Here's mine, and the error:
> I found that mixlab nodes was at least for me interfering with florence2. Uninstalling Mixlabs fixed my issues.
It worked. Thank you.
...is mine, and the error.
Florence2 wants 4.39.0 according to requirements.txt. Problem is... When I installed that earlier my ComfyUI stopped starting at all 🥵 I have resolved this now by going back to 4.38.2, but feel I'm stuck between a rock and a hard place because ComfyUI doesn't seem to like 4.39.0 and Florence2ModelLoader doesn't like 4.38.2 😥
Upgraded my transformers to 4.44.0. Same error :'(
Error occurred when executing Florence2ModelLoader:
Failed to import transformers.models.cohere.configuration_cohere because of the following error (look up to see its traceback):
No module named 'transformers.models.cohere.configuration_cohere'
File "C:\ComfyUI_P\ComfyUI\execution.py", line 152, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI_P\ComfyUI\execution.py", line 82, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI_P\ComfyUI\execution.py", line 75, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI_P\ComfyUI\custom_nodes\ComfyUI-Florence2\nodes.py", line 166, in loadmodel
model = AutoModelForCausalLM.from_pretrained(model_path, attn_implementation=attention, device_map=device, torch_dtype=dtype,trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI_P\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 541, in from_pretrained
)
File "C:\ComfyUI_P\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 752, in keys
if key in self._model_mapping.keys()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI_P\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 753, in
]
File "C:\ComfyUI_P\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 749, in _load_attr_from_module
mapping_keys = [
^^^^^^^^^^
File "C:\ComfyUI_P\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 693, in getattribute_from_module
# object at the top level.
^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI_P\python_embeded\Lib\site-packages\transformers\utils\import_utils.py", line 1593, in __getattr__
File "C:\ComfyUI_P\python_embeded\Lib\site-packages\transformers\utils\import_utils.py", line 1605, in _get_module```
Out of nowhere it suddenly works...
I updated ComfyUI via ComfyUI-Manager (which apparently downgraded transformers to 4.38.2 again) and restarted, and suddenly Florence2 works again... I have no idea what's going on here 🤔
I have the same problem. I run ComfyUI with Stability Matrix. Every time a ComfyUI update drops, I do the update, then launch my workflow, and the error occurs 100% of the time on the first run.
The only workaround I found was to go to the ComfyUI Manager, do an "update all", then relaunch the ComfyUI server and relaunch the first run --> then it works 100% of the time.
Conclusion: some update through the manager interface seems to be needed to redownload some Florence dependency, but the remaining issue is that every ComfyUI update through git or a Stability Matrix update re-breaks said dependency every time, so it is an endless cycle unless some patch is deployed for that.
It's a bit annoying, but fixable.
> ...is mine, and the error.
> Florence2 wants 4.39.0 according to requirements.txt. Problem is... When I installed that earlier my ComfyUI stopped starting at all 🥵 I have resolved this now by going back to 4.38.2, but feel I'm stuck between a rock and a hard place because ComfyUI doesn't seem to like 4.39.0 and Florence2ModelLoader doesn't like 4.38.2 😥
I solved my problem: it was a mismatch (conflicting dependencies). I used pipdeptree to check it: pip install pipdeptree, then run it by typing pipdeptree, and you can see your conflicts in the output.
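For anyone who would rather not install anything extra, roughly the same information can be pulled from the standard library. This is just an illustrative sketch that lists every installed package declaring a dependency on transformers:

```python
# List installed distributions whose declared requirements mention
# "transformers" (a stdlib-only approximation of one pipdeptree view).
import re
from importlib.metadata import distributions

def requires_transformers(req: str) -> bool:
    # Take just the distribution name, dropping extras, version specifiers
    # and environment markers, then compare the normalized name.
    match = re.match(r"[A-Za-z0-9._-]+", req)
    return bool(match) and match.group(0).lower().replace("_", "-") == "transformers"

for dist in distributions():
    for req in dist.requires or []:
        if requires_transformers(req):
            print(f"{dist.metadata['Name']:<35} requires {req}")
```

Two entries demanding incompatible transformers ranges is exactly the kind of conflict pipdeptree flags.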
ComfyUI itself works fine with even the latest transformers; your issue is some other custom node using it. There are some that even force-install a different version. It's bad practice, and there's nothing I can do about that.
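One way to track down the culprit is to check which custom nodes pin transformers in their requirements.txt. Here's a small, hypothetical helper for that; the path is just the example from this thread, so adjust it to your own install:

```python
# Scan each custom node's requirements.txt for a transformers requirement,
# to spot nodes that might (re)install a different version on update.
from pathlib import Path

# Example path taken from this thread; change it to your ComfyUI install.
CUSTOM_NODES = Path(r"D:\aitools\ComfyUI\ComfyUI\custom_nodes")

for req_file in CUSTOM_NODES.glob("*/requirements.txt"):
    for line in req_file.read_text(encoding="utf-8", errors="ignore").splitlines():
        line = line.strip()
        # Note: this also matches names that merely start with "transformers".
        if line and not line.startswith("#") and line.lower().startswith("transformers"):
            print(f"{req_file.parent.name}: {line}")
```

Any node printed here with a hard pin (e.g. transformers==4.38.x) is a likely candidate for re-breaking the install on every update.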
Error occurred when executing DownloadAndLoadFlorence2Model:
Failed to import transformers.models.cohere.configuration_cohere because of the following error (look up to see its traceback): No module named 'transformers.models.cohere.configuration_cohere'
File "D:\webui\comfyui\execution.py", line 152, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\webui\comfyui\execution.py", line 82, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\webui\comfyui\execution.py", line 75, in map_node_over_list results.append(getattr(obj, func)(**slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\webui\comfyui\custom_nodes\ComfyUI-Florence2\nodes.py", line 90, in loadmodel model = AutoModelForCausalLM.from_pretrained(model_path, attn_implementation=attention, device_map=device, torch_dtype=dtype,trust_remote_code=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\webui\comfyui.ext\Lib\site-packages\transformers\models\auto\auto_factory.py", line 541, in from_pretrained )
File "D:\webui\comfyui.ext\Lib\site-packages\transformers\models\auto\auto_factory.py", line 752, in keys if key in self._model_mapping.keys() ^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\webui\comfyui.ext\Lib\site-packages\transformers\models\auto\auto_factory.py", line 753, in ]
File "D:\webui\comfyui.ext\Lib\site-packages\transformers\models\auto\auto_factory.py", line 749, in _load_attr_from_module mapping_keys = [ ^^^^^^^^^^ File "D:\webui\comfyui.ext\Lib\site-packages\transformers\models\auto\auto_factory.py", line 693, in getattribute_from_module
^^^^^^^^^^^^^^^^^^^^^ File "D:\webui\comfyui.ext\Lib\site-packages\transformers\utils\import_utils.py", line 1593, in getattr File "D:\webui\comfyui.ext\Lib\site-packages\transformers\utils\import_utils.py", line 1605, in _get_module
Failed to import transformers.models.cohere.configuration_cohere because of the following error (look up to see its traceback): No module named 'transformers.models.cohere.configuration_cohere'