Open · lanlan990802 opened this issue 3 days ago
After importing the Transformers library from Hugging Face, I hit the error below. Could you please advise how to resolve it?

ValueError: TimeSeriesTransformerForPrediction does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co/Maple728/TimeMoE-50M/discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new
I got the code from https://huggingface.co/Maple728/TimeMoE-50M/tree/main
Maybe I made a mistake in how I obtained the files; I downloaded them directly from the browser.
What transformers version are you using?
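For anyone answering this question, the installed version can be printed directly (`transformers` exposes its version string at the package level):

```python
# Print the installed transformers version so it can be included in the issue report.
import transformers

print(transformers.__version__)
```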
@lanlan990802 this seems like an error with the code hosted on the repo you linked. Have you considered opening an issue there directly?
https://huggingface.co/Maple728/TimeMoE-50M/discussions
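In the meantime, a possible workaround is to ask for the default ("eager") attention implementation instead of Flash Attention 2 when loading. This is only a sketch under an assumption: that the checkpoint's custom modeling code honors the standard `attn_implementation` argument of `from_pretrained` — it is not a confirmed fix for this model.

```python
from transformers import AutoModelForCausalLM

def load_timemoe(model_id: str = "Maple728/TimeMoE-50M"):
    """Sketch of a workaround: load the model without Flash Attention 2.

    Assumes the checkpoint's custom code accepts the standard
    `attn_implementation` override; this is not a confirmed fix.
    """
    return AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,        # TimeMoE ships custom modeling code on the Hub
        attn_implementation="eager",   # avoid the unsupported Flash Attention 2 path
    )
```

If the error persists because the downloaded config pins Flash Attention 2, checking the local config.json for an attention-implementation setting may also be worth a look.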