-
Can you tell me how expensive it is to fine-tune BERT, what is needed for it (hardware specs), how long it takes, and so on?
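For a rough sense of scale (a back-of-envelope sketch, not a measurement): full fine-tuning with Adam in fp32 keeps the weights, the gradients, and two optimizer moments per parameter, so the model state alone costs about 16 bytes per parameter of GPU memory, before activations (which grow with batch size and sequence length):

```python
def finetune_state_gb(n_params, bytes_weights=4, bytes_grads=4, bytes_adam=8):
    """GPU memory in GB for weights + gradients + Adam moments in fp32.
    Excludes activation memory, which depends on batch size and sequence length."""
    return n_params * (bytes_weights + bytes_grads + bytes_adam) / 1e9

print(finetune_state_gb(110e6))   # BERT-base (~110M parameters) -> 1.76 GB
print(finetune_state_gb(340e6))   # BERT-large (~340M parameters) -> 5.44 GB
```

In practice this is why BERT-base fits comfortably on a single 16 GB GPU while BERT-large with long sequences and larger batches tends to need gradient accumulation or mixed precision.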
-
Firstly, thanks for providing such an amazing library.
**Background:**
I have seen a lot of interest in multilingual cross-encoders, and although the [amberoad model](https://huggingface.co/amberoad/…
-
Hello, could you provide the bert_fine_tuning.py file? I can only find the bert_fine_tuning.ipynb file.
-
I would like to propose using BERT (Bidirectional Encoder Representations from Transformers) to analyze the IMDb movie review dataset. BERT has revol…
-
I'm fine-tuning my model (neuralmind/bert-large-portuguese-cased) on the STS task using the ASSIN dataset, which contains Portuguese sentence pairs for the RTE and STS tasks. I'm facing a problem: when I try 1 epoch, …
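For reference, ASSIN STS results are usually reported as the Pearson correlation between the model's predicted similarities and the gold 1–5 scores. A minimal pure-Python sketch of that metric (the numbers below are made up, not real ASSIN data):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

gold = [1.0, 2.5, 4.0, 5.0]      # hypothetical gold similarity labels (1-5 scale)
pred = [0.21, 0.45, 0.80, 0.95]  # hypothetical cosine similarities from the model

print(round(pearson(gold, pred), 3))
```

Because Pearson is invariant to scale and offset, cosine similarities in [0, 1] can be compared directly against 1–5 gold scores without rescaling.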
-
The link to the fine-tuning guide on the Hugging Face model card for "temporal_tagger_BERT_tokenclassifier" appears to be unavailable: https://huggingface.co/satyaalmasian/temporal_tagger_BERT_tokencl…
-
### Feature request
It seems that there is no config for DeBERTa v1/v2/v3 as a decoder (while there are such configs for BERT/RoBERTa and similar models)... This is needed in order to perform TSDAE unsupervised…
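For context, the flags that exist for BERT-style models but not for DeBERTa are two config switches that turn an encoder into a decoder; TSDAE-style setups rely on them. A minimal sketch, assuming `transformers` is installed:

```python
from transformers import BertConfig

# For BERT, these two flags add causal masking and cross-attention layers
# when a model is later instantiated from this config.
decoder_config = BertConfig(is_decoder=True, add_cross_attention=True)

# DebertaConfig exposes no equivalent decoder wiring, which is exactly
# what this feature request is asking for.
print(decoder_config.is_decoder, decoder_config.add_cross_attention)
```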
-
Hello,
Are there any plans to use BERT (or BERT-like) models for NLP sequence-to-sequence and/or classification tasks?
Best regards
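One route that already exists in `transformers` (not necessarily what the maintainers plan here): `EncoderDecoderModel` pairs two BERT-style models into a sequence-to-sequence system. A tiny randomly initialized sketch, assuming `transformers` and `torch` are installed; the config sizes are made up to keep it fast, and real use would load pretrained checkpoints instead:

```python
import torch
from transformers import BertConfig, EncoderDecoderConfig, EncoderDecoderModel

# Made-up tiny dimensions so the sketch runs in seconds with random weights.
enc = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                 num_attention_heads=2, intermediate_size=64)
dec = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                 num_attention_heads=2, intermediate_size=64,
                 is_decoder=True, add_cross_attention=True)

model = EncoderDecoderModel(
    config=EncoderDecoderConfig.from_encoder_decoder_configs(enc, dec))
model.config.decoder_start_token_id = 0
model.config.pad_token_id = 0

ids = torch.tensor([[1, 2, 3, 4]])
out = model(input_ids=ids, decoder_input_ids=ids, labels=ids)
print(out.logits.shape)  # one logit vector over the vocab per decoder position
```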
-
Hi,
I found in some blogs that fine-tuning a BERT model works better than extracting features from BERT without fine-tuning and then training a neural network from scratch, and they justify this by the f…
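The usual justification comes down to this: with frozen features, the task head can only reweight whatever the pretrained encoder already exposes, while fine-tuning lets the encoder itself move toward the task. A toy pure-Python illustration of that gap (the linear "encoder" and the data are made up, not BERT):

```python
import random

random.seed(0)
# Toy data: the label depends only on the second feature, but the
# "pretrained" encoder below projects onto the first feature only.
X = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
Y = [x2 for _, x2 in X]

def mse(w1, w2, b):
    return sum((b * (w1 * x1 + w2 * x2) - y) ** 2
               for (x1, x2), y in zip(X, Y)) / len(X)

def train(freeze_encoder, steps=2000, lr=0.1):
    w1, w2 = 1.0, 0.0   # "pretrained" encoder: keeps x1, discards x2
    b = 0.1             # task head on top of the encoder output
    for _ in range(steps):
        g1 = g2 = gb = 0.0
        for (x1, x2), y in zip(X, Y):
            z = w1 * x1 + w2 * x2
            err = 2.0 * (b * z - y) / len(X)
            gb += err * z
            g1 += err * b * x1
            g2 += err * b * x2
        b -= lr * gb
        if not freeze_encoder:   # fine-tuning also updates the encoder
            w1 -= lr * g1
            w2 -= lr * g2
    return mse(w1, w2, b)

frozen = train(freeze_encoder=True)      # head alone cannot recover x2
finetuned = train(freeze_encoder=False)  # encoder adapts, error collapses
print(frozen, finetuned)
```

The frozen run stays near the variance of the labels no matter how long the head trains, while the fine-tuned run drives the error toward zero, which is the blogs' argument in miniature.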
-
I was just fine-tuning BERT on a classification task, and I noticed that a classifier head is appended after BERT's output during fine-tuning.
#### create_model(...) in run_classifier.py
```python
...
with …
```