-
These models are available for live trading, backtesting, and research in the cloud environment.
Access installed models and their revisions
```python
from huggingface_hub import scan_cache_dir

# Scan the local Hugging Face cache and list each cached repo with its revisions.
cache_info = scan_cache_dir()
for repo in cache_info.repos:
    print(repo.repo_id, [rev.commit_hash for rev in repo.revisions])
```
-
Hi, I'm curious about the training data of the XLM-R models fine-tuned on the CoNLL NER datasets (e.g. xlm-roberta-large-finetuned-conll03-german, xlm-roberta-large-finetuned-conll03-english). Are the models…
-
The code refers to files at paths like the ones below. Where do I need to download these files from? I haven't been able to find them.
self.dict_path = 'E:/bert_weight_files/roberta/vocab.txt' self.config_path='E:/bert_weight_files/roberta/bert_config_rbt3.json' self.checkpoint_pat…
-
Hi, I think the released LayoutLM v1 model on Hugging Face is initialized with BERT weights, but the paper states that initializing with RoBERTa weights gives better results. Is there a reas…
-
### Describe the issue
Greetings,
Are there any plans to release instructions, or at least the dataset format, so we can fine-tune the `llmlingua-2-xlm-roberta-large-meetingbank` or the base `xlm-…
-
When I use roberta_zh to pretrain a CPT model, I get the error "Error(s) in loading state_dict for BertModel". So which pretrained model should I use, RoBERTa or BERT?
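That error usually means the checkpoint's tensor names or shapes don't match the model class being loaded into (roberta_zh checkpoints follow the BERT architecture, so they are typically loaded with `BertModel` rather than `RobertaModel`). A minimal PyTorch sketch, using a hypothetical `TinyEncoder` module, reproduces the same error from a simple shape mismatch:

```python
import torch
from torch import nn

# Hypothetical tiny module, just to reproduce the error message offline.
class TinyEncoder(nn.Module):
    def __init__(self, vocab_size: int):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, 8)

a = TinyEncoder(100)  # target model
b = TinyEncoder(200)  # "checkpoint" with a different vocab size

# The checkpoint's tensor shapes don't match the target model, so strict
# loading fails with "Error(s) in loading state_dict for TinyEncoder".
try:
    a.load_state_dict(b.state_dict())
except RuntimeError as e:
    print("Error(s) in loading state_dict" in str(e))  # True
```

The same diagnosis applies here: if the checkpoint was produced by a BERT-architecture model, load it with the matching BERT classes (or remap the keys) instead of a RoBERTa class.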
-
I would like to evaluate the toxicity of the generations of my fine-tuned models. I'm interested in using ToxiGen, which seems popular (e.g. it was used in the Llama 2 paper).
However, looking at the curren…
-
# ❓ Questions and Help
Hi all, I'm trying to load a pretrained XLM-RoBERTa model from Hugging Face using xformers to examine the potential speed-up. To the best of my abilities, I've defined a config …
-
I have been using `BertPreTrainedModel` to load this RoBERTa model, which works well.
I noticed that `pytorch_transformers` also supports `Roberta`.
```
from pytorch_transformers import (BertCo…
```
Vimos updated 5 years ago
-
Hi,
I'm having issues running the RoBERTa script (for the US dataset).
I ran this line:
`!python run_language_modeling.py --output_dir=output_roberta_US --model_type=roberta --model_name_or…