-
## Description
Provides RoBERTa parameters pre-trained on a large-scale Chinese corpus.
I plan to convert them from https://github.com/brightmart/roberta_zh
## References
- https://github.com/brightma…
-
Hi, I have observed that many tweets were not downloaded because they had been deleted. Also, for some strange reason, for some tasks the number of sentences was higher than the number r…
-
https://arxiv.org/pdf/1704.05021.pdf
Running NMT with top-k is enough to get into EMNLP? ~~Would fine-tuning BERT with top-k get into this year's ACL?~~
-
It would be nice to have the model also hosted on huggingface (https://huggingface.co/models), so people could use it from the huggingface API without manually downloading the model dump.
-
Hi Xiujun,
Thanks for your work and source code. I came across this error when training; could you please tell me where I can find this file?
Cheers
-
A few people were mentioning papers that drew a lot of chatter on Twitter during ACL 2020 (see [this](https://twitter.com/srush_nlp/status/1280940612146081794)). I think it might be a good idea to add such a sec…
-
**Issue by [zphang](https://github.com/zphang)**
_Friday Apr 10, 2020 at 21:37 GMT_
_Originally opened as https://github.com/nyu-mll/jiant/pull/1059_
----
Performance comparison on a set of represe…
-
* https://arxiv.org/abs/2006.03511
* 2020
A transcompiler, also called a source-to-source translator, is a system that converts source code written in a high-level programming language (such as C++ or Python) into another language.
Transcompilers are mainly used for interoperability, and to port codebases written in outdated languages (COBOL, Python 2, etc.) to modern…
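To make the idea concrete, here is a deliberately tiny, rule-based sketch of source-to-source translation: rewriting a Python 2 `print` statement as a Python 3 call. This is purely illustrative and is not the paper's approach, which learns the translation with neural models over full language grammars.

```python
import re

def py2_print_to_py3(line: str) -> str:
    """Rewrite a Python 2 `print x` statement as a Python 3 `print(x)` call.

    A toy, single-rule transcompiler: real systems parse the full grammar
    of the source language instead of pattern-matching one statement.
    """
    match = re.match(r"^(\s*)print\s+(.+)$", line)
    # Leave lines alone if they are not a bare `print` statement,
    # or if the arguments are already parenthesized.
    if match and not match.group(2).startswith("("):
        indent, args = match.groups()
        return f"{indent}print({args})"
    return line

# Translate a small fragment of "outdated" Python 2 into Python 3.
source = ["x = 1", "print x", "print(x)"]
translated = [py2_print_to_py3(line) for line in source]
```

Even this trivial rule shows why interoperability is the main use case: the translated fragment runs under a modern interpreter while preserving the original behavior.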
-
In the paper, you said that in the first refinement of LSR, World War II interacts with several local mentions with higher attention scores, e.g., 0.43 for the mention Lake Force. But in Figure 5, …
-
Hi,
Thanks for sharing your work. Looking forward to your code release. (/≧▽≦)/ Do you have an estimated time?
Thanks