yzhangcs / parser

:rocket: State-of-the-art parsers for natural language.
https://parser.yzhang.site/
MIT License

Missing word_embed with "bert" encoder #138

Closed davebulaval closed 5 months ago

davebulaval commented 6 months ago

Hi,

In this part of the code https://github.com/yzhangcs/parser/blob/bebdd350e034c517cd5b71185e056503290164fa/supar/model.py#L93C2-L116C60, when "bert" is set as the encoder, the `word_embed` attribute is never initialized. As a result, `load_pretrained` fails, and I cannot find a way to skip it.
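For illustration, a minimal sketch of the kind of guard I mean (the method body and attribute names here are assumptions modeled on supar's `Model`, not the actual implementation):

```python
import torch.nn as nn

# Sketch of a guarded load_pretrained (assumed body, not the actual supar code):
# only touch word_embed when the attribute exists, which is not the case
# when the encoder is "bert".
def load_pretrained(self, embed=None):
    if embed is not None and hasattr(self, 'word_embed'):
        self.pretrained = nn.Embedding.from_pretrained(embed)
        nn.init.zeros_(self.word_embed.weight)
    return self
```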

davebulaval commented 6 months ago

The problem is a breaking change: `n_out` in `TransformerEmbedding` was switched from `n_feat_embed` to `n_plm_embed`. The pre-trained model therefore does not set this value correctly, `n_plm_embed` stays at 0, and the projection layer ends up as (768, 768) instead of (768, n_feat_embed).
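To make the mismatch concrete, a hedged sketch of what I would expect to work: passing the desired output size explicitly so the projection maps 768 to `n_feat_embed`. The argument names below follow the `TransformerEmbedding` I see in `supar.modules`, but they should be double-checked against the installed version:

```python
from supar.modules import TransformerEmbedding

# Value assumed for illustration: the feature size the rest of the model expects.
n_feat_embed = 100

# With n_out left at its 0 default the layer keeps the PLM size (768 -> 768);
# passing it explicitly should yield a (768, n_feat_embed) projection instead.
encoder = TransformerEmbedding('bert-base-cased',
                               n_layers=4,
                               n_out=n_feat_embed,
                               pad_index=0)
print(encoder.n_out)  # expected to report n_feat_embed rather than 768
```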

github-actions[bot] commented 5 months ago

This issue is stale because it has been open for 30 days with no activity.

github-actions[bot] commented 5 months ago

This issue was closed because it has been inactive for 7 days since being marked as stale.