Louis-udm / VGCN-BERT

MIT License
122 stars 35 forks source link

Ask the author to answer, thank you! #12

Open dongcy-AHU opened 3 years ago

dongcy-AHU commented 3 years ago

Hello, may I ask, how is the [CLS] token in multi-layer self-attention initialized? What is its vector dimension?

Louis-udm commented 3 years ago

Hi,

Thank you for your attention. It's just a torch.nn.Embedding with 768 dimensions; the initialization function is torch.nn.init.normal_. Just look at the source and do a little search.
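To illustrate the answer above, here is a minimal sketch (not the VGCN-BERT source itself): in BERT-style models the [CLS] token is simply one row of the word-embedding table, so it is initialized together with all other token embeddings. The vocabulary size, std value, and [CLS] token id below are standard BERT-base conventions, used here only for illustration.

```python
import torch
import torch.nn as nn

vocab_size = 30522   # BERT-base vocabulary size (assumption for illustration)
hidden_size = 768    # embedding / hidden dimension mentioned above
cls_token_id = 101   # [CLS] id in the standard BERT vocabulary

# The [CLS] vector lives inside the ordinary word-embedding table.
word_embeddings = nn.Embedding(vocab_size, hidden_size)

# Initialize the whole table from a normal distribution
# (BERT uses std=0.02; the comment above only names normal_).
nn.init.normal_(word_embeddings.weight, mean=0.0, std=0.02)

# Looking up the [CLS] id yields its 768-dimensional vector.
cls_vector = word_embeddings(torch.tensor([cls_token_id]))
print(cls_vector.shape)  # torch.Size([1, 768])
```

The vector is then refined through the self-attention layers like any other token's embedding; only the lookup table row is what gets "initialized".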