zjunlp / KnowPrompt

[WWW 2022] KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction

How was the relation embedding constructed in the algorithm? #32

Closed yccckid closed 1 month ago

yccckid commented 1 month ago

I want to know how the relation embedding is constructed. From this repo, I learned that the relation embedding is built from the BERT [MASK] output? But that seems to be the same as the type word embedding. Am I right?

njcx-ai commented 1 month ago

Thank you for your interest in our work. To clarify, the relation embeddings are constructed at the vocabulary level of BERT. We perform relation classification by calculating the similarity between the model's output at the [MASK] position and the relation embeddings. I hope this explanation addresses your question.
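In simplified form, the idea looks roughly like the sketch below. This is only an illustration using the Hugging Face `transformers` API with placeholder relation token names, not the exact repository code; in the paper the virtual answer-word embeddings are initialized from the relation label semantics rather than randomly.

```python
# Illustrative sketch, not the repository's exact code.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Add one virtual answer word per relation to BERT's vocabulary
# (token names here are placeholders).
relation_tokens = ["[class1]", "[class2]", "[class3]"]
tokenizer.add_special_tokens({"additional_special_tokens": relation_tokens})
model.resize_token_embeddings(len(tokenizer))  # extends the (tied) MLM head as well
relation_ids = tokenizer.convert_tokens_to_ids(relation_tokens)

# Prompt with a [MASK] at the relation slot.
text = "London is the capital of the UK . London [MASK] the UK"
inputs = tokenizer(text, return_tensors="pt")
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]

with torch.no_grad():
    logits = model(**inputs).logits  # (1, seq_len, vocab_size)

# The MLM head computes a dot product between the hidden state at [MASK] and
# every vocabulary embedding; restricting it to the virtual relation tokens
# yields the relation scores used for classification.
relation_logits = logits[0, mask_pos, :][:, relation_ids]
pred = relation_ids[relation_logits.argmax(-1).item()]
print(tokenizer.convert_ids_to_tokens(pred))
```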

zxlzr commented 1 month ago

hi buddy, do you have any further questions?

yccckid commented 3 weeks ago

Thank you for your kindness. I am still confused about the construction of the virtual answer words, and there is one specific question I need clarified. (1) In the paper, you mention that the MLM head is expanded with the relation embedding head (see Fig. 2), and there is a dot product, right? So I want to know how this process is carried out in your code. What is the relation embedding head, and where does it come from? I sketch my current understanding below. Thank you!
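For reference, here is my rough understanding of the dot-product step (shapes and names are placeholders, not taken from your repository); please correct me if this does not match your implementation:

```python
# Rough sketch of my understanding, not the repository's code.
import torch
import torch.nn as nn

hidden_size, num_relations = 768, 40  # placeholder sizes

# "Relation embedding head": one learnable vector per relation, analogous to
# the extra rows that the expanded MLM head gains for the virtual answer words.
relation_embedding_head = nn.Parameter(torch.randn(num_relations, hidden_size))

# Hidden state of the [MASK] position from the encoder (batch_size, hidden_size).
mask_hidden = torch.randn(2, hidden_size)

# Dot product between the [MASK] representation and every relation embedding
# gives one logit per relation; cross-entropy is then applied on top of this.
relation_logits = mask_hidden @ relation_embedding_head.T  # (batch_size, num_relations)
print(relation_logits.shape)
```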