huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

about code #34338

Open lanlan990802 opened 3 days ago

lanlan990802 commented 3 days ago

System Info

Could you please advise me on how to resolve this issue after importing the Transformers library from Hugging Face? ValueError: TimeSeriesTransformerForPrediction does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co/Maple728/TimeMoE-50M/discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new
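A hedged sketch of a possible workaround, assuming the error comes from the checkpoint's custom loading code requesting `attn_implementation="flash_attention_2"` for a model class that does not support it: explicitly asking for the default `"eager"` attention backend at load time should sidestep that check. The checkpoint name is taken from the error message; whether eager attention gives acceptable speed for this model is untested here, and the actual download is wrapped in a function so nothing is fetched until it is called.

```python
from transformers import AutoModelForCausalLM


def load_time_moe(checkpoint: str = "Maple728/TimeMoE-50M"):
    """Load the TimeMoE checkpoint without Flash Attention 2 (sketch).

    Assumption: the ValueError is raised because Flash Attention 2 is
    requested for a model class whose Transformers implementation does
    not support it; forcing "eager" avoids that code path.
    """
    return AutoModelForCausalLM.from_pretrained(
        checkpoint,
        trust_remote_code=True,       # TimeMoE ships custom modeling code
        attn_implementation="eager",  # instead of "flash_attention_2"
    )
```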

Who can help?

No response

Information

Tasks

Reproduction

I got the code from https://huggingface.co/Maple728/TimeMoE-50M/tree/main

Expected behavior

Maybe I made a mistake by doing this the wrong way; I downloaded the files directly from the browser.

tomcotter7 commented 2 days ago

What transformers version are you using?
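For reference, one quick way to check the installed version (assuming `transformers` is importable in the environment that raises the error):

```python
import transformers

# Print the installed Transformers version; the attn_implementation
# argument and the Flash Attention 2 support checks changed across
# 4.x releases, so the exact version matters for this error.
print(transformers.__version__)
```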

LysandreJik commented 1 day ago

@lanlan990802 this seems like an error with the code hosted on the repo you linked. Have you considered opening an issue there directly?

https://huggingface.co/Maple728/TimeMoE-50M/discussions