thunlp / PELT

Source code for "A Simple but Effective Pluggable Entity Lookup Table for Pre-trained Language Models"

A question about infusing information into entity embeddings. #7

Closed BAOOOOOM closed 2 years ago

BAOOOOOM commented 2 years ago

Hello, I have read your paper and your code, and I think it is meaningful work that could improve performance in many fields. I would like to apply your method in my own work, but I don't understand how to use the enhanced entity embeddings downstream. I tried using PLM1 to get the context's word embeddings, then replacing the entity's embedding with the new embedding and feeding it to PLM2. The results were terrible. Can you help me with this? Thank you very much!

YeDeming commented 2 years ago

We generate the embedding from PLM1 and then apply it to PLM1 to extend its vocabulary. Hence the embedding doesn't support PLM2.
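
A minimal sketch of that "same-PLM" idea (not the authors' code): build an entity vector from PLM1's output representations of the entity's masked mentions, then register it as a new input token of that same PLM1. The checkpoint name, the mean-over-mask aggregation, the norm rescaling value, and the `[ENT_Paris]` token are all illustrative assumptions; the point is that the vector lives in PLM1's embedding space, so plugging it into a different PLM2 would not work.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

name = "roberta-base"  # assumed PLM1 checkpoint
tok = AutoTokenizer.from_pretrained(name)
plm1 = AutoModelForMaskedLM.from_pretrained(name).eval()

def entity_embedding(contexts, target_norm=2.0):
    """Aggregate hidden states at <mask> positions over contexts mentioning the entity."""
    vecs = []
    for ctx in contexts:  # each context replaces the entity mention with <mask>
        inputs = tok(ctx, return_tensors="pt")
        with torch.no_grad():
            hidden = plm1.roberta(**inputs).last_hidden_state[0]
        mask_pos = (inputs["input_ids"][0] == tok.mask_token_id).nonzero(as_tuple=True)[0]
        vecs.append(hidden[mask_pos].mean(dim=0))
    emb = torch.stack(vecs).sum(dim=0)
    return emb / emb.norm() * target_norm  # rescale to a fixed norm (assumed value)

# Register the vector as a new token of the SAME model (PLM1), not a different PLM2.
contexts = ["The capital of France is <mask>.", "<mask> hosted the 1900 Summer Olympics."]
ent_vec = entity_embedding(contexts)
tok.add_tokens(["[ENT_Paris]"])
plm1.resize_token_embeddings(len(tok))
with torch.no_grad():
    plm1.get_input_embeddings().weight[tok.convert_tokens_to_ids("[ENT_Paris]")] = ent_vec
```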

BAOOOOOM commented 2 years ago

> We generate the embedding from PLM1 and then apply it to PLM1 to extend its vocabulary. Hence the embedding doesn't support PLM2.

Thanks!