SKT-AI / KoGPT2

Korean GPT-2 pretrained cased (KoGPT2)

Fix typo #20

Closed by Kyeongpil 4 years ago

Kyeongpil commented 4 years ago

As described in gpt2_345m_hparams, I think the number of self-attention heads (line 370) should be 16, not 24.
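For context, OpenAI's published GPT-2 345M ("medium") configuration uses 24 transformer *layers* but 16 attention *heads*, which is a plausible source of the mix-up. A minimal sketch of those reference values (the dict name and comments are illustrative, not KoGPT2's actual config file; KoGPT2 uses its own Korean vocabulary):

```python
# Reference hyperparameters for GPT-2 345M ("medium"), per OpenAI's release.
# Note: 24 is the number of transformer layers; the head count is 16.
gpt2_345m_hparams = {
    "n_ctx": 1024,   # context window length
    "n_embd": 1024,  # hidden (embedding) dimension
    "n_head": 16,    # self-attention heads per layer
    "n_layer": 24,   # transformer blocks
}

# Sanity check: the hidden size must split evenly across the heads.
assert gpt2_345m_hparams["n_embd"] % gpt2_345m_hparams["n_head"] == 0
per_head_dim = gpt2_345m_hparams["n_embd"] // gpt2_345m_hparams["n_head"]
```

With 16 heads, each head gets a 64-dimensional slice of the 1024-dimensional hidden state; 24 heads would not divide 1024 evenly, which is another hint the value belongs to `n_layer`.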

haven-jeon commented 4 years ago

I agree!