AIFSH / OmniGen-ComfyUI


[Resolved] Error (Windows) #1

0X-JonMichaelGalindo opened this issue 3 weeks ago

0X-JonMichaelGalindo commented 3 weeks ago

(screenshot of the error shown in ComfyUI)

Error: Phi3Transformer does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. Please request the support for this architecture: https://github.com/huggingface/transformers/issues/28005. If you believe this error is a bug, please open an issue in Transformers GitHub repository and load your model with the argument attn_implementation="eager" meanwhile. Example: model = AutoModel.from_pretrained("openai/whisper-tiny", attn_implementation="eager")

Updating dependencies did not help. Please let me know if there is any more information I can provide, and thank you for this work.
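
For reference, the workaround named in the error text is to force the "eager" attention implementation when the Phi3-based model is loaded. Whether and where OmniGen-ComfyUI exposes that argument is an assumption; the sketch below only illustrates the pattern on a stock Phi-3 checkpoint:

```python
# Minimal sketch of the "eager" attention workaround from the error message.
# The checkpoint name below is only an example; it is NOT the model path that
# the OmniGen-ComfyUI node actually loads.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-4k-instruct",  # example Phi-3 checkpoint, not OmniGen's
    attn_implementation="eager",         # bypass the unsupported SDPA code path
    trust_remote_code=True,              # only needed on older transformers releases
)
```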

xiao-ning-ning commented 3 weeks ago

Install these pinned versions and it should fix it: torch==2.3.1 transformers==4.45.2 datasets==2.20.0 accelerate==0.26.1 jupyter==1.0.0 numpy==1.26.3 pillow==10.2.0 peft==0.9.0 diffusers==0.30.3 timm==0.9.16

huafitwjb commented 3 weeks ago

How do I install these?

0X-JonMichaelGalindo commented 3 weeks ago

> Install these pinned versions and it should fix it: torch==2.3.1 transformers==4.45.2 datasets==2.20.0 accelerate==0.26.1 jupyter==1.0.0 numpy==1.26.3 pillow==10.2.0 peft==0.9.0 diffusers==0.30.3 timm==0.9.16

This cannot be right. Genmo Mochi requires torch 2.5 to run, so following those instructions would break my ComfyUI installation. I have OmniGen running on my PC using Pinokio; that environment runs OmniGen with torch 2.5.0+cu124. (screenshot of the Pinokio environment)
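
For anyone comparing environments here, a quick way to check which torch and transformers versions a given ComfyUI or Pinokio install actually uses is to run something like this with that environment's own Python:

```python
# Print the versions installed in the current Python environment; run this with
# the same interpreter that ComfyUI (or Pinokio) launches.
import torch
import transformers

print("torch:", torch.__version__)                # e.g. 2.5.0+cu124 in the Pinokio setup above
print("transformers:", transformers.__version__)  # 4.45.2 is what resolved this issue (see below)
```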

0X-JonMichaelGalindo commented 2 weeks ago

Success! Updating transformers to 4.45.2 resolved my issue.