Closed PrasanthChowhan closed 3 months ago
@PrasanthChowhan, can you please post the output from the console window? It contains a bit more information. Second, please run

```
D:\UTILITY\ComfyUI_windows_portable\python_embedded\python.exe -m pip freeze | findstr "timm einops transformers"
```

and post the output, so I can try to reproduce the problem.
I have the same error
```
ERROR:root:Traceback (most recent call last):
  File "A:\ComfyUI_windows_portable\ComfyUI\execution.py", line 152, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "A:\ComfyUI_windows_portable\ComfyUI\execution.py", line 82, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "A:\ComfyUI_windows_portable\ComfyUI\execution.py", line 75, in map_node_over_list
    results.append(getattr(obj, func)(*slice_dict(input_data_all, i)))
  File "A:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Hangover-Moondream\ho_moondream.py", line 56, in interrogate
    self.model = AutoModelForCausalLM.from_pretrained(huggingface_model, trust_remote_code=trust_remote_code).to(dev)
  File "A:\ComfyUI_windows_portable\python_embeded\lib\site-packages\transformers\models\auto\auto_factory.py", line 561, in from_pretrained
    return model_class.from_pretrained(
  File "A:\ComfyUI_windows_portable\python_embeded\lib\site-packages\transformers\modeling_utils.py", line 3462, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "C:\Users\Mr Peculiar\.cache\huggingface\modules\transformers_modules\vikhyatk\moondream1\f6e9da68e8f1b78b8f3ee10905d56826db7a5802\moondream.py", line 16, in __init__
    self.vision_encoder = VisionEncoder()
  File "C:\Users\Mr Peculiar\.cache\huggingface\modules\transformers_modules\vikhyatk\moondream1\f6e9da68e8f1b78b8f3ee10905d56826db7a5802\vision_encoder.py", line 98, in __init__
    VisualHolder(timm.create_model("vit_so400m_patch14_siglip_384"))
  File "A:\ComfyUI_windows_portable\python_embeded\lib\site-packages\timm\models\factory.py", line 67, in create_model
    raise RuntimeError('Unknown model (%s)' % model_name)
RuntimeError: Unknown model (vit_so400m_patch14_siglip_384)
```
```
A:\ComfyUI_windows_portable\python_embeded>python.exe -m pip freeze | findstr "timm einops transformers"
einops==0.7.0
taming-transformers==0.0.1
timm==0.6.13
transformers==4.36.2
```
You need timm>=0.9.12. It can be upgraded with

```
python.exe -m pip install timm --upgrade
```

within the `python_embeded` folder. But be aware that this could potentially break other nodes that need an older version of timm to work properly, so use at your own risk.
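As a side note, a quick way to see whether the installed packages meet the minimums mentioned in this thread (timm >= 0.9.12, transformers >= 4.36.0) is a small stdlib-only script; this is just a sketch, the minimum versions are taken from the replies above and the simple tuple comparison ignores pre-release tags.

```python
# Sanity check that installed packages meet the minimums mentioned in this
# thread (timm >= 0.9.12, transformers >= 4.36.0). Run it with the embedded
# interpreter, e.g.: python_embeded\python.exe check_versions.py
from importlib.metadata import version, PackageNotFoundError

MINIMUMS = {"timm": (0, 9, 12), "transformers": (4, 36, 0)}

def parse(v: str) -> tuple:
    """Turn '0.9.12' into (0, 9, 12) for a plain tuple comparison.
    Good enough for simple release versions; pre-release tags are ignored."""
    parts = []
    for p in v.split("."):
        digits = "".join(ch for ch in p if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def check(pkg: str, minimum: tuple) -> bool:
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
        return False
    ok = parse(installed) >= minimum
    print(f"{pkg} {installed}: {'OK' if ok else 'too old, please upgrade'}")
    return ok

if __name__ == "__main__":
    for pkg, minimum in MINIMUMS.items():
        check(pkg, minimum)
```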
Can you tell me where the models downloaded from huggingface are placed? I have been unable to connect to huggingface using this node and cannot download models, but my web browser can connect to huggingface and download models.
The models are stored in `.cache\huggingface\hub\` in the user's home directory (on Windows). The problem is most probably the transformers package, it must be >=4.36.0.
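For reference, the Hugging Face libraries also let you move that cache via environment variables. Below is a sketch of how the cache directory is resolved, based on the documented defaults (`~/.cache/huggingface/hub`, overridable with `HF_HOME` or `HUGGINGFACE_HUB_CACHE`); the example paths are illustrative only.

```python
# Resolve the Hugging Face hub cache directory the way the HF libraries
# document it: HUGGINGFACE_HUB_CACHE wins, then HF_HOME/hub, then the
# default ~/.cache/huggingface/hub.
import os

def hub_cache_dir() -> str:
    explicit = os.environ.get("HUGGINGFACE_HUB_CACHE")
    if explicit:
        return explicit
    hf_home = os.environ.get("HF_HOME")
    if hf_home:
        return os.path.join(hf_home, "hub")
    return os.path.join(os.path.expanduser("~"), ".cache", "huggingface", "hub")

print(hub_cache_dir())
```

Inside that directory, each downloaded model lives in its own folder named `models--<org>--<name>` (e.g. `models--vikhyatk--moondream1`), so a manually placed model goes in such a subfolder rather than loose under `hub`.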
Can the model path be reconfigured in any file? If not, where exactly should I put the downloaded model: right under the hub path, or in a folder beneath it?
I am currently working on an update that lets users configure a custom path for manually downloaded model files.
It can be tested using the custom_model_path branch.