-
**Feature request**
I'm having trouble exporting the RoBERTa model from the fairseq library to TensorFlow.
Below is some example code showing how I currently load RoBERTa and my initial attempt to …
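A common route for this kind of export is fairseq → Hugging Face PyTorch → TensorFlow (`TFRobertaModel.from_pretrained(..., from_pt=True)`), and the fiddly step is renaming fairseq's parameter keys to the Hugging Face layout. Below is an illustrative sketch of that renaming step only; the parameter names are assumptions based on typical RoBERTa checkpoints, so verify them against your own `state_dict` keys before relying on the mapping.

```python
import re

# Hedged sketch: map fairseq-style RoBERTa parameter names to the
# Hugging Face layout. The names below are assumptions drawn from
# common checkpoints, not a complete or authoritative mapping.
RULES = [
    (r"^decoder\.sentence_encoder\.embed_tokens\.weight$",
     "roberta.embeddings.word_embeddings.weight"),
    (r"^decoder\.sentence_encoder\.embed_positions\.weight$",
     "roberta.embeddings.position_embeddings.weight"),
    (r"^decoder\.sentence_encoder\.layers\.(\d+)\.self_attn\.q_proj\.(weight|bias)$",
     r"roberta.encoder.layer.\1.attention.self.query.\2"),
    (r"^decoder\.sentence_encoder\.layers\.(\d+)\.self_attn\.k_proj\.(weight|bias)$",
     r"roberta.encoder.layer.\1.attention.self.key.\2"),
    (r"^decoder\.sentence_encoder\.layers\.(\d+)\.self_attn\.v_proj\.(weight|bias)$",
     r"roberta.encoder.layer.\1.attention.self.value.\2"),
]

def rename(key: str) -> str:
    """Return the Hugging Face-style name for a fairseq parameter key."""
    for pattern, repl in RULES:
        if re.match(pattern, key):
            return re.sub(pattern, repl, key)
    return key  # pass through anything we don't recognize

print(rename("decoder.sentence_encoder.layers.3.self_attn.q_proj.weight"))
# → roberta.encoder.layer.3.attention.self.query.weight
```

Once the state dict is renamed, the Hugging Face checkpoint can usually be loaded into the TF class directly rather than hand-converting tensors.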
-
Research and find a suitable base language model.
_Examples:_
- FinBERT
- RoBERTa
- Llama 3.1
- Phi
**Resources to check:**
https://ollama.com/blog
-
Tried to load a fine-tuned RoBERTa model with
```
import mii
pipe = mii.pipeline("roberta fine-tuned model path")
```
but it raises the error
```
ValueError: Unsupported model type roberta
```
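DeepSpeed-MII only supports a limited set of model architectures, which is what the `ValueError` is reporting. A cheap pre-flight check of the checkpoint's `config.json` gives a clearer failure before handing the path to `mii.pipeline`. The sketch below is illustrative: the `SUPPORTED` set is a stand-in, not MII's actual list, so consult the MII documentation for your version.

```python
import json
import pathlib

# Assumption: placeholder set of architectures, NOT MII's real list.
SUPPORTED = {"bert", "gpt2", "opt", "llama", "bloom"}

def check_model_type(model_dir: str) -> str:
    """Read model_type from config.json and fail early if unsupported."""
    config = json.loads((pathlib.Path(model_dir) / "config.json").read_text())
    model_type = config.get("model_type", "unknown")
    if model_type not in SUPPORTED:
        raise ValueError(f"Unsupported model type {model_type}")
    return model_type
```

For a RoBERTa classifier specifically, falling back to a plain `transformers` pipeline is often the simplest workaround when MII rejects the architecture.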
-
Why can't xlm_roberta_text2natsql_schema_item_classifier be used? It raises `OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found`. How can this be solved?
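That `OSError` means none of the weight files that `from_pretrained()` looks for exist in the checkpoint folder, typically because the download is incomplete or the path points at the wrong directory. A minimal stdlib sketch for diagnosing it, listing which of the expected files are actually present:

```python
import pathlib

# The file names from the error message above; from_pretrained()
# needs at least one of them in the checkpoint directory.
WEIGHT_FILES = ("pytorch_model.bin", "tf_model.h5",
                "model.ckpt.index", "flax_model.msgpack")

def find_weights(model_dir: str) -> list[str]:
    """Return the recognized weight files present in model_dir."""
    folder = pathlib.Path(model_dir)
    return [name for name in WEIGHT_FILES if (folder / name).exists()]
```

An empty result confirms the checkpoint was never fully saved or downloaded to that path (note some newer checkpoints ship `safetensors` weights instead, which older library versions do not look for).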
-
I noticed that it is marked as coming soon in the README. Any plans to update the configurations?
-
The extra wristband is already working with OpenGL, right? After all, you sent it to me working so we could check the loading time and everything...
But I didn't see it working on the site; there is a file there, but it doesn't work.…
-
## **What are you trying to do?**
Open Roberta Lab display values
## **What did you expect to happen?**
Able to quit the program normally
## **What actually happened?**
The EV3 is stuck in the progra…
-
Changing from roberta-base to bert-base throws a forward-function error. I was wondering whether you have tested with a different base model, and what is so specific to RoBERTa in your model.
-
The code breaks when using any model other than BERT. I debugged the code and found that it is written for the BERT tokenizer only, while the tokenizers of other transformer models ar…
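One concrete way BERT-only code breaks on other models: BERT and RoBERTa wrap inputs in different special tokens, so anything that hard-codes `[CLS]`/`[SEP]` fails on RoBERTa's `<s>`/`</s>`. The sketch below hard-codes the token strings for illustration; in real code, the robust fix is to read the special tokens off the tokenizer object itself rather than assuming BERT's.

```python
# Illustrative only: hard-coded special tokens for two model families.
# In practice, use tokenizer.cls_token / tokenizer.sep_token instead.
SPECIAL_TOKENS = {
    "bert": ("[CLS]", "[SEP]"),
    "roberta": ("<s>", "</s>"),
}

def wrap(text: str, model_type: str) -> str:
    """Wrap text in the model family's boundary tokens."""
    bos, eos = SPECIAL_TOKENS[model_type]
    return f"{bos} {text} {eos}"

print(wrap("hello world", "bert"))     # [CLS] hello world [SEP]
print(wrap("hello world", "roberta"))  # <s> hello world </s>
```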
-
I'm having trouble accessing the model for inference and evaluation.
"OSError: CARDS_RoBERTa_Classifier is not a local folder and is not a valid model identifier listed on 'https://huggingface.…