-
Hello,
The outputs of both `run_prompt_finetune.py` and `run_prompt_finetune_test.py` showed that the models always predicted positive labels. I tried both BERT and RoBERTa as the PLM.
There's …
-
Could you explain how costly it is to fine-tune BERT: what is needed (hardware specs), how long it takes, and so on?
-
Have you tried experimenting with lower-parameter models like Flan-T5, ALBERT, BERT, etc., or even Qwen 0.5B?
With fine-tuning, might they suffice in this specific domain?
I have a low end machi…
-
Firstly, thanks for providing such an amazing library.
**Background:**
I have seen a lot of interest in multilingual cross encoders and although [amberoad model](https://huggingface.co/amberoad/…
-
Hello, could you please provide the bert_fine_tuning.py file? I only see the bert_fine_tuning.ipynb file.
-
I'm fine-tuning my model (neuralmind/bert-large-portuguese-cased) for the STS task using the ASSIN dataset, which contains Portuguese sentence pairs for the RTE and STS tasks. I'm facing a problem: when I try 1 epoch, …
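As a side note for anyone debugging an STS setup like this: STS systems are conventionally evaluated with the Pearson correlation between predicted and gold similarity scores, so it is worth checking that metric directly. A minimal stdlib-only sketch (the example scores are made up for illustration):

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical gold vs. predicted similarity scores (STS scores are
# typically on a 1-5 scale in datasets like ASSIN).
gold = [1.0, 2.5, 4.0, 5.0]
pred = [1.2, 2.0, 3.8, 4.9]
r = pearson(gold, pred)
```

A correlation close to 0 (or NaN because the model predicts a constant value) after one epoch usually points to a training-setup issue rather than a metric bug.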
-
I want to fine-tune BERT on my custom dataset, but I don't know how to tag the sentences or run the fine-tuning. If anyone knows, please advise.
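For the tagging part, the usual first step is to assign each sentence an integer label id and write the data in a line-per-example JSON ("JSONL") layout, which most fine-tuning scripts (e.g. Hugging Face `datasets.load_dataset("json", ...)` followed by the `Trainer` API) can consume. A minimal sketch, where the label names and example sentences are purely hypothetical:

```python
import json

# Hypothetical label set for a binary sentence-classification task.
label2id = {"negative": 0, "positive": 1}

# Hypothetical hand-tagged examples; replace with your own data.
examples = [
    {"text": "The battery lasts all day.", "label": "positive"},
    {"text": "The screen cracked in a week.", "label": "negative"},
]

# Write one JSON object per line, mapping label names to integer ids.
with open("train.jsonl", "w", encoding="utf-8") as f:
    for ex in examples:
        row = {"text": ex["text"], "label": label2id[ex["label"]]}
        f.write(json.dumps(row) + "\n")
```

The actual fine-tuning step then tokenizes the `text` field and trains a sequence-classification head on top of BERT; the details depend on which training script or library you use.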
-
The link to the fine-tuning guide on the Hugging Face model card for "temporal_tagger_BERT_tokenclassifier" appears to be unavailable: https://huggingface.co/satyaalmasian/temporal_tagger_BERT_tokencl…
-
### Feature request
It seems there is no config for DeBERTa v1/v2/v3 as a decoder (while there are configs for BERT/RoBERTa and similar models). This is needed in order to perform TSDAE unsupervised…
-
Hello,
are there any plans to support BERT models (or BERT-like models) for NLP sequence-to-sequence and/or classification tasks?
Best regards