-
Hi there,
I need to know how the context is formatted when training BlenderbotSmall. For instance, when you want to incorporate knowledge or a persona plus the n past utterances as t…
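For concreteness, here is a minimal sketch of one plausible way to flatten a persona plus the n past utterances into a single context string before tokenizing. The newline joins and the "your persona:" prefix are assumptions borrowed from the ParlAI convention, not the confirmed BlenderbotSmall training format (which is exactly what this question asks about):

```python
# Sketch only: the separators below are assumptions, not the verified
# format that facebook/blenderbot_small-90M was trained with.
from transformers import BlenderbotSmallTokenizer

tokenizer = BlenderbotSmallTokenizer.from_pretrained(
    "facebook/blenderbot_small-90M"
)

persona = ["i like to ski.", "my wife does not like me anymore."]
history = ["hello, how are you?", "i am fine, just got back from skiing."]

# Assumed layout: persona lines first (ParlAI-style prefix), then the
# n most recent utterances, all joined with newlines.
context = "\n".join(f"your persona: {p}" for p in persona)
context += "\n" + "\n".join(history[-2:])  # keep the n=2 past utterances

inputs = tokenizer(context, return_tensors="pt", truncation=True)
print(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])[:20])
```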
-
Hello!
First of all, thanks for this wonderful collection of pretrained models. I wonder what the domain is of the corpora used for pretraining BERTurk, DistilBERTurk, ConvBERTurk, and ELECTRA. I w…
-
Looking to take advantage of the wonderful work y'all have done.
Regarding creating a new text-to-IPA encoder: does the existing model embedding have placeholders for the full IPA character setup…
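One way to check this empirically is to probe the tokenizer for IPA symbols and, if they are missing, add them and resize the embedding matrix. A minimal sketch follows; the multilingual BERT checkpoint is just an illustrative stand-in, not the model this question is about:

```python
# Sketch: probe a tokenizer's vocabulary for IPA symbols and add any
# that are missing. New tokens get freshly initialized embedding rows,
# so they must be learned during fine-tuning.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

ipa_symbols = ["ʃ", "ʒ", "θ", "ð", "ŋ", "ə", "ɪ", "ʊ"]  # small sample of the IPA set
missing = [
    s for s in ipa_symbols
    if tokenizer.convert_tokens_to_ids(s) == tokenizer.unk_token_id
]
print("missing IPA symbols:", missing)

if missing:
    tokenizer.add_tokens(missing)
    model.resize_token_embeddings(len(tokenizer))
```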
-
Please make sure that this is a feature request. As per our [GitHub Policy](https://github.com/tensorflow/tensorflow/blob/master/ISSUES.md), we only address code/doc bugs, performance issues, feature …
-
Is it feasible to train the vocoder with a batch size of 6? I have a laptop GPU with 8 GB of memory, and batch size 6 currently shows about 7 GB of GPU memory in use.
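One way to answer this empirically is to run a few training steps at batch size 6 and report peak GPU memory, leaving headroom below 8 GB. A minimal sketch, assuming a CUDA device; the tiny convolutional stack and input shapes are placeholders, not the actual vocoder:

```python
# Sketch: measure peak GPU memory over a few training steps at batch
# size 6. Replace the placeholder model and shapes with the real ones.
import torch
import torch.nn as nn

device = torch.device("cuda")
model = nn.Sequential(  # placeholder for the real vocoder
    nn.Conv1d(80, 512, kernel_size=7, padding=3),
    nn.ReLU(),
    nn.Conv1d(512, 1, kernel_size=7, padding=3),
).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

torch.cuda.reset_peak_memory_stats(device)
for _ in range(3):
    mels = torch.randn(6, 80, 800, device=device)    # batch size 6
    target = torch.randn(6, 1, 800, device=device)
    loss = nn.functional.l1_loss(model(mels), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

peak_gb = torch.cuda.max_memory_allocated(device) / 1024**3
print(f"peak GPU memory: {peak_gb:.2f} GB")
```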
-
Hey all,
This is a temporary issue to discuss the idea of training a BERT model specific to Arabic, **AraBERT**. We will move to another repository once we have a clear understanding of the prob…
-
In this issue, I will maintain a list of ML and NLP research papers that I have identified as worth summarizing. Read more about this initiative in #23.
If you are interested in working together o…
-
Hi, thanks for sharing this great work!
I was wondering which split you used to obtain the results in Table 3. I noticed that "random split" is set as the default in the evaluation script, which is…
-
Hi,
I'm curious: do you happen to know if there's a way to use a pre-trained model from the Hugging Face hub to initialize KEPLER training? I was hoping to initialize KEPLER with a RoBERT…
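A possible starting point, sketched below under the assumption (not confirmed by the KEPLER repo) that one would export the Hugging Face RoBERTa weights and then remap the parameter names onto KEPLER's fairseq-based encoder; the remapping itself is left open since it depends on the KEPLER codebase:

```python
# Sketch: export a Hugging Face RoBERTa checkpoint's weights so they can
# later be remapped onto KEPLER's encoder. The fairseq name mapping is
# deliberately omitted; only the export and inspection steps are shown.
import torch
from transformers import RobertaModel

hf_model = RobertaModel.from_pretrained("roberta-base")
state_dict = hf_model.state_dict()
torch.save(state_dict, "hf_roberta_base.pt")

# Inspect parameter names to plan the remapping onto fairseq/KEPLER keys.
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape))
```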
-
https://github.com/Alibaba-MIIL/ImageNet21K/blob/00ef9989825bbcb8dedc91eac18638c129eb5ad8/src_files/models/utils/factory.py#L54
Why do we need to load a pretrained ImageNet-1K model if we are going t…