@LuoweiZhou OK, thanks for your reply.
I saw that project. It seems they have provided the link. What is the difference between their model and yours?
> **Pre-trained Models**
>
> We release a large cased UniLM model pre-trained with Wikipedia and BookCorpus corpora. The model is trained by using the same model configuration and WordPiece vocabulary as BERT. The model parameters can be loaded as in the fine-tuning code.
>
> The links to the pre-trained models:
>
> UniLMv1-large-cased: 24-layer, 1024-hidden, 16-heads, 340M parameters
Ours is base cased, rather than large cased.
Can I use the large cased model in this project?
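From the numbers quoted above, the large model differs from the base one (24 vs. 12 layers, 1024 vs. 768 hidden units, 16 vs. 12 heads), so I assume the config handed to the fine-tuning code would need to change accordingly. A minimal sketch of the two configurations, assuming a `pytorch_pretrained_bert`-style `BertConfig` (the exact config class this repo uses may differ):

```python
from pytorch_pretrained_bert.modeling import BertConfig

# Hypothetical sketch: cased WordPiece vocab (28996 entries) with the
# standard BERT base vs. large hyper-parameters; the actual values used
# by this project may differ.
base_config = BertConfig(
    vocab_size_or_config_json_file=28996,
    hidden_size=768,
    num_hidden_layers=12,
    num_attention_heads=12,
    intermediate_size=3072,
)

large_config = BertConfig(
    vocab_size_or_config_json_file=28996,
    hidden_size=1024,
    num_hidden_layers=24,
    num_attention_heads=16,
    intermediate_size=4096,
)
```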
UniLM checkpoint:

```bash
wget -O bert_save.tar.gz "https://onedrive.live.com/download?cid=E5364FD183A1F5BB&resid=E5364FD183A1F5BB%212016&authkey=AB5-lxzCkgpfLhg"
```
The above link is no longer available. Thanks.
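In case the checkpoint gets re-hosted, I assume unpacking and loading it looks roughly like the following; the file name inside the archive (`model.bin` here) is a guess on my part:

```python
import tarfile
import torch

# Hypothetical sketch: unpack the archive fetched by the wget command above
# and inspect the checkpoint. The member name "model.bin" is an assumption
# and may differ in the actual release.
with tarfile.open("bert_save.tar.gz", "r:gz") as tar:
    tar.extractall("bert_save")

state_dict = torch.load("bert_save/model.bin", map_location="cpu")
print(f"Loaded {len(state_dict)} parameter tensors")
```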