-
Hello,
I want to run:
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("NVEagle/Eagle-X5-13B-Chat")
But I get:
ValueError: The checkpoint you are tr…
flehn updated
1 month ago
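A common workaround for this class of error, assuming the repository ships its own modeling code (not confirmed for this checkpoint), is to pass `trust_remote_code=True`; otherwise upgrading `transformers` or running the project's own codebase may be needed. A minimal sketch:

```python
from transformers import AutoModelForCausalLM

# Sketch only: trust_remote_code lets transformers import custom modeling
# code bundled with the checkpoint instead of requiring a built-in class.
model = AutoModelForCausalLM.from_pretrained(
    "NVEagle/Eagle-X5-13B-Chat",
    trust_remote_code=True,
)
```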
-
Hi friend, maybe you can try using the Llama architecture instead of the original Transformer? (You can refer to the Llama architecture in llama2.c.)
-
that prefers simplicity over tricks, hackability over tedious organization, and interpretability over generality.
https://github.com/karttikeya/minREV?tab=readme-ov-file
-
> Not for this PR, but WDYT about putting these Hub filters as a badge at the top of each trainer doc, similar to how `transformers` does it for architectures: https://huggingface.co/docs/transformers…
-
Any plan to support this type of architecture, i.e. ViT, Swin?
innat updated
11 months ago
-
I am able to use both the Florence-2-base-PromptGen-v15 and Florence-2-large-PromptGen-v15 models with your Tagger node, but when I try to use either of them with the Florence2Run node, I get the foll…
-
### Feature request
Recently, we have added the ability to load `gguf` files within [transformers](https://huggingface.co/docs/hub/en/gguf).
The goal was to offer the possibility to users …
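For reference, a minimal sketch of how GGUF loading looks in recent `transformers` releases; the repo id and filename below are placeholders, not taken from the request:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Placeholder GGUF repository and file name; substitute a real one.
model_id = "TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF"
gguf_file = "tinyllama-1.1b-chat-v1.0.Q6_K.gguf"

# transformers dequantizes the GGUF weights and loads them into the
# matching model class, so the usual API applies afterwards.
tokenizer = AutoTokenizer.from_pretrained(model_id, gguf_file=gguf_file)
model = AutoModelForCausalLM.from_pretrained(model_id, gguf_file=gguf_file)
```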
-
I just followed the steps, but when I run the following code:
# Load model directly
from transformers import AutoModel
model = AutoModel.from_pretrained("Efficient-Large-Model/Llama-3-VILA1.5-8B")
…
-
Dear Xiaoyuan Zhang,
I am very interested in your project. This year, my research group published a paper titled "A Hyper-Transformer Model for Controllable Pareto Front Learning with Split Feasibi…
-
Hi, I get this error when I want to use Florence-2 large PromptGen:
**The checkpoint you are trying to load has model type `florence2` but Transformers does not recognize this architecture. This could…
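Since `florence2` is not a built-in `transformers` architecture, loading it normally requires `trust_remote_code=True`. A minimal sketch, assuming the MiaoshouAI PromptGen repo id (the exact id is not given in the post):

```python
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

# Assumed repo id for the PromptGen checkpoint; adjust to the one you use.
repo_id = "MiaoshouAI/Florence-2-large-PromptGen-v1.5"

# florence2 is implemented as custom code on the Hub, so trust_remote_code
# is required for transformers to resolve the architecture.
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)
processor = AutoProcessor.from_pretrained(repo_id, trust_remote_code=True)

# Typical Florence-2 style usage: a task token as the prompt plus an image.
image = Image.open("example.jpg")
inputs = processor(text="<CAPTION>", images=image, return_tensors="pt")
generated = model.generate(
    input_ids=inputs["input_ids"],
    pixel_values=inputs["pixel_values"],
    max_new_tokens=128,
)
print(processor.batch_decode(generated, skip_special_tokens=False)[0])
```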