Hi:) Thanks for reporting this issue.
Actually, KoBERT isn't originally owned by me (original repo -> SKTBrain KoBERT); this one is a version ported to the transformers library. I also noticed this issue when I first made the ported version, so I made monologg/kobert-lm, which is retrained on my own corpus for 1 epoch, starting from the parameters of monologg/kobert.
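In case it helps, loading the retrained checkpoint should work with the usual API; a minimal sketch, assuming the standard transformers from_pretrained call:

```python
from transformers import BertForMaskedLM

# monologg/kobert-lm starts from the monologg/kobert weights and was
# retrained for 1 epoch, so its MLM head carries trained parameters.
model = BertForMaskedLM.from_pretrained("monologg/kobert-lm")
```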
Is it possible to upload the pretrained model with the missing parameters (either in huggingface's transformers or by providing a link to the original tf checkpoint)?
I'll check this one with the original KoBERT owner:)
I see. Thanks for the clarification! I will use monologg/kobert-lm for now and check later whether the original KoBERT owner can fix this issue.
Thanks for releasing the KoBERT model! However, I found that the parameters of the BertOnlyMLMHead layer might be missing in the monologg/kobert model, which I think is a common issue that I have also found in released BERT models for other languages, such as Greek and Russian. The steps to reproduce are sketched below. Is it possible to upload the pretrained model with the missing parameters (either in huggingface's transformers or by providing a link to the original tf checkpoint)?
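The reproduction boils down to loading the checkpoint into a model class that includes the MLM head; a minimal sketch, assuming transformers' output_loading_info flag on from_pretrained:

```python
from transformers import BertForMaskedLM

# Load monologg/kobert into a model class that includes the MLM head
# (BertOnlyMLMHead). output_loading_info=True additionally returns a
# report of which keys were absent from the checkpoint.
model, loading_info = BertForMaskedLM.from_pretrained(
    "monologg/kobert", output_loading_info=True
)

# If the MLM head parameters are missing, they are listed here and get
# randomly initialized instead of loaded from pretrained weights.
print(loading_info["missing_keys"])  # expect cls.predictions.* entries
```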