-
I want to deploy MaCBert but I cannot find any helpful blog posts. Do you have instructions for deploying this model?
-
Hi guys,
I am following the Megatron-LM example to pre-train a BERT model but I'm getting this error:
```
[rank0]: Traceback (most recent call last):
[rank0]: File "/root/Megatron-LM/pretrai…
-
https://yangjx29.github.io/posts/1bb262dc.html
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Paper | Talk
Part 1. Title & Authors
Pre-trai
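One concrete detail from the paper worth noting in these reading notes is the 80/10/10 masking rule of the masked-LM objective: 15% of positions are selected; of those, 80% become `[MASK]`, 10% become a random token, and 10% are left unchanged. A minimal sketch in Python (the `[MASK]` id 103 and vocab size 30522 are the original BERT-base-uncased values, used here only for illustration):

```python
import random

MASK_ID = 103  # [MASK] id in the original BERT-base-uncased vocab


def mask_tokens(token_ids, vocab_size, mask_prob=0.15, rng=None):
    """BERT-style MLM masking: select ~15% of positions; of those,
    80% -> [MASK], 10% -> random token, 10% -> kept unchanged.
    Returns (corrupted inputs, labels), with -100 at unselected
    positions so they are ignored by the loss."""
    rng = rng or random.Random(0)
    inputs = list(token_ids)
    labels = [-100] * len(inputs)
    for i in range(len(inputs)):
        if rng.random() < mask_prob:
            labels[i] = inputs[i]  # the model must predict the original token
            r = rng.random()
            if r < 0.8:
                inputs[i] = MASK_ID                 # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(vocab_size)  # 10%: random token
            # else: 10%: keep the original token
    return inputs, labels
```

The unchanged-token case is what forces the model to keep a real representation for every input position, not only the masked ones.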
-
Intel MLPerf inference runs are failing for ResNet-50 and BERT, as shown [here](https://github.com/GATEOverflow/cm4mlops/actions/runs/11829661024/job/32961852185)
-
Hi there. Thanks for the great library!
I have an issue regarding the use of BERT-based models. I trained different models by fine-tuning them on my custom dataset (roberta, luke, deberta, xlm-rober…
-
Hello,
May I ask: is the pytorch.bin provided in the README the original Google BERT weights, or the weights after training?
-
Suppose we have a BERT NLI model trained to zero-shot classify texts given a prompt such as 'This text relates to cars.' Is it possible to use DSPy to optimise the prompt for such a mode…
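For context on what that prompt is doing: the standard NLI-based zero-shot recipe turns each candidate label into a hypothesis (e.g. 'This text relates to cars.') and scores it with the model's entailment head, so the prompt being optimised is the hypothesis template. A minimal sketch of just the scoring step, assuming you already have the NLI model's (contradiction, entailment) logits per candidate label (the function names are hypothetical, not DSPy API):

```python
import math


def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]


def zero_shot_scores(nli_logits, multi_label=False):
    """nli_logits: one (contradiction, entailment) logit pair per
    candidate label, as produced by an NLI model run on
    (text, hypothesis) pairs.

    Single-label mode: softmax the entailment logits against each
    other, yielding one distribution over labels.
    Multi-label mode: softmax entailment vs. contradiction per label,
    yielding an independent probability for each label."""
    if multi_label:
        return [softmax([c, e])[1] for c, e in nli_logits]
    return softmax([e for _, e in nli_logits])
```

Under this framing, a prompt optimiser only changes how each label is verbalised into a hypothesis; the scoring math above stays fixed.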
-
Hi, I'm trying to run MotionBERT inference in the wild, and I'm having issues with the JSON format that I should use for this.
Below is the JSON format I am currently using. I can't find an example of a…
-
Thanks for your excellent work! Could you please tell me how to get scannet_607_bert-base-uncased_id.pth for pre-training? Also, when will you release all the pre-trained checkpoints?
-
Thanks to @fierceX, we now have a fine-tuning example of BERT for SQuAD 1.1. https://github.com/dmlc/gluon-nlp/pull/493
Follow-up work:
- [ ] Tutorial for fine-tuning on SQuAD 1.1. The QA dataset…
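One piece a SQuAD tutorial usually has to explain, independent of the framework, is span decoding at inference time: the model emits per-token start and end logits, and the answer is the (start, end) pair maximizing their sum subject to start ≤ end and a length cap. A small illustrative sketch (names are hypothetical, not GluonNLP API):

```python
def best_span(start_logits, end_logits, max_answer_len=30):
    """SQuAD-style extractive QA decoding: return the (start, end)
    token indices maximizing start_logits[s] + end_logits[e],
    subject to s <= e < s + max_answer_len."""
    best = (0, 0)
    best_score = float("-inf")
    for s, s_logit in enumerate(start_logits):
        # Only consider end positions within the allowed answer length.
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score = score
                best = (s, e)
    return best
```

Production decoders typically add a no-answer score (for SQuAD 2.0) and take a top-k over spans, but the constraint structure is the same.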