-
Hi Matthias,
Is it possible to have a minimal example of how to re-train the model for other languages?
What does the training data look like? What are the "labels"? Could you please explain the ma…
-
While fine-tuning the transformers model, i.e. `transformers.TFDistilBertModel.from_pretrained(pretrained_weights)`,
I got this error message.
![image](https://user-images.githubusercontent.com/538…
-
Currently, we use cosine similarity as the similarity metric. With complex architectures like BERT, it may not be effective, as the objective functions used for pre-training or fine-tuning do not direct…
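For reference, the metric in question can be sketched in a few lines; the vectors below are toy placeholders standing in for mean-pooled BERT embeddings, not real model output:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy sentence embeddings (placeholders for mean-pooled token vectors).
u = np.array([0.2, 0.9, 0.1])
v = np.array([0.25, 0.8, 0.05])
print(round(cosine_similarity(u, v), 3))
```

Because the pre-training objective never optimizes this quantity directly, two semantically close sentences can still land far apart under it.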
-
Some measures can be taken to obtain a better model, among them parameter optimization (_fine-tuning_) and/or testing another model.
In initial tests, I managed to obtain a _weighted f1-…
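As a point of reference for the metric mentioned above, weighted F1 can be computed in plain NumPy (the labels below are illustrative; the definition is analogous to scikit-learn's `f1_score(average='weighted')` over the classes present in `y_true`):

```python
import numpy as np

def weighted_f1(y_true, y_pred):
    """Per-class F1 averaged with class-frequency (support) weights."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    classes, support = np.unique(y_true, return_counts=True)
    f1s = []
    for c in classes:
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return float(np.average(f1s, weights=support))

print(weighted_f1([0, 0, 1, 1, 1], [0, 1, 1, 1, 0]))
```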
-
Hello!
We are Korean students.
We would like to implement a Korean slang filtering system with your BERT model.
A test is in progress, fine-tuning on the CoLA task with run_classifier.py from the exis…
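If it helps, the CoLA processor in the original BERT run_classifier.py reads tab-separated lines where column 1 is the label and column 3 is the sentence — that column layout, and the Korean placeholder sentences below, are assumptions worth double-checking against the processor code:

```python
import csv, io

# CoLA-style TSV rows: id, label (1 = acceptable, 0 = filtered),
# a placeholder column, then the sentence. Sentences are invented examples.
rows = [
    ["doc1", "1", "*", "이 문장은 괜찮은 예시입니다."],
    ["doc2", "0", "*", "나쁜 비속어 예시 문장"],
]
buf = io.StringIO()
writer = csv.writer(buf, delimiter="\t")
writer.writerows(rows)
print(buf.getvalue())
```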
-
Hi,
Thanks for the great work! I have been using the multilingual sentence-bert pretrained model and fine-tuning on QAs from conversations with `MultipleNegativesRankingLoss` for an answer sugges…
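For readers unfamiliar with that loss, the in-batch-negatives idea behind it can be sketched in NumPy — each question's paired answer is the positive and every other answer in the batch is a negative (the function name, scale value, and random embeddings here are illustrative, not the library's implementation):

```python
import numpy as np

def multiple_negatives_ranking_loss(q, a, scale=20.0):
    """Cross-entropy over in-batch cosine similarities: for question q_i,
    a_i is the positive and all other a_j in the batch are negatives."""
    qn = q / np.linalg.norm(q, axis=1, keepdims=True)
    an = a / np.linalg.norm(a, axis=1, keepdims=True)
    scores = scale * qn @ an.T                      # (batch, batch)
    m = scores.max(axis=1, keepdims=True)           # stable log-softmax
    log_probs = scores - (m + np.log(np.exp(scores - m).sum(axis=1, keepdims=True)))
    return float(-np.mean(np.diag(log_probs)))

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
loss_matched = multiple_negatives_ranking_loss(q, q)                      # perfect pairs
loss_random = multiple_negatives_ranking_loss(q, rng.normal(size=(4, 8)))  # unrelated answers
print(loss_matched, loss_random)
```

Well-matched pairs drive the loss toward zero, while mismatched pairs leave it near the log of the batch size.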
-
I have many experiments in mind where I need to condition a Transformer Decoder on some input (e.g. image features, discrete binary labels, a one-hot representing some concept, a question, etc.) in …
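One common way to do this — sketched here in NumPy under assumed shapes, not tied to any particular codebase — is to project the conditioning input into memory vectors and let decoder states cross-attend to them:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(decoder_states, cond_features, Wq, Wk, Wv):
    """Decoder positions attend over conditioning features
    (image features, label embeddings, encoded question tokens, ...)."""
    Q = decoder_states @ Wq                          # (T, d)
    K = cond_features @ Wk                           # (M, d)
    V = cond_features @ Wv                           # (M, d)
    weights = softmax(Q @ K.T / np.sqrt(Q.shape[-1]))  # (T, M)
    return weights @ V

rng = np.random.default_rng(1)
d = 16
dec = rng.normal(size=(5, d))    # 5 decoder positions
cond = rng.normal(size=(3, d))   # e.g. 3 image-region features
Wq, Wk, Wv = [rng.normal(size=(d, d)) for _ in range(3)]
out = cross_attention(dec, cond, Wq, Wk, Wv)
print(out.shape)
```

For discrete labels or one-hots, the same pattern applies after an embedding lookup; alternatives include prepending the conditioning vectors to the decoder input sequence.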
-
Is there any documentation or `examples` that I can refer to for training a transformer model from scratch using `fairseq2`? The `examples` folder in the repository seems empty.
-
I have trained the NER model on sciie dataset using the following config:
```
DATASET='sciie'
TASK='ner'
with_finetuning='_finetune'  # or '' for no fine-tuning
dataset_size=38124…
-
I have a question about the implementation of Part-of-Speech tagging.
The following command runs POS tagging.
`python3 pos_tagging.py --do_train --do_tagging train --gpus 0 1 --dataset_folder wiki…