yao8839836 / kg-bert

KG-BERT: BERT for Knowledge Graph Completion
Apache License 2.0

Using `.add_tokens` #13

Open MichaelHirn opened 4 years ago

MichaelHirn commented 4 years ago

If I understand correctly, you are using the description of an entity (or relationship) and tokenizing that description. The entities and relationships do not have their own tokens, right?

Did you try to learn a dedicated embedding for an entity/relationship, or does that not really make sense here?

yao8839836 commented 4 years ago

@MichaelHirn

Yes, I am only using tokens in the description of an entity or a relation.

The embedding for an entity/relationship can be taken as the average of its description token embeddings (the hidden states of BERT), but for the knowledge graph completion tasks these explicit embeddings are not necessary.
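To illustrate the averaging idea, here is a minimal sketch with made-up numbers: each row stands in for one BERT hidden state of a token in the entity's description, and the entity embedding is simply their mean. The token labels and values are hypothetical, not taken from KG-BERT's code.

```python
import numpy as np

# Hypothetical 4-dim hidden states, one per description token
# (real BERT hidden states are 768-dim; values here are made up).
token_embeddings = np.array([
    [0.2, 0.4, 0.1, 0.3],   # e.g. "steve"
    [0.6, 0.0, 0.5, 0.1],   # e.g. "jobs"
    [0.1, 0.8, 0.2, 0.4],   # e.g. "founder"
])

# Entity embedding = mean over the description-token embeddings.
entity_embedding = token_embeddings.mean(axis=0)
print(entity_embedding)  # [0.3, 0.4, 0.2667, 0.2667] (approx.)
```

In practice one would run the description through BERT and average the last-layer hidden states in the same way.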

MichaelHirn commented 4 years ago

Ahh I see, thanks for your comment.

I'm wondering if you have tried creating a new token (optionally initialising it with a related embedding, or even with the average of several token embeddings) and then learning an embedding for that new token. I'm trying to adapt your approach for our use case, where using the description of an entity does not work, and I'm curious whether you have tried that approach and, if so, how it worked out.
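For reference, the mean-initialisation idea above can be sketched with a toy embedding matrix. With Hugging Face models this would correspond to `tokenizer.add_tokens(...)` followed by `model.resize_token_embeddings(len(tokenizer))` and then overwriting the new row; the snippet below only demonstrates the initialisation step with NumPy, and the matrix size and `related_ids` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a model's token-embedding matrix: 5 tokens, 4 dims.
embedding_matrix = rng.normal(size=(5, 4))

# IDs of existing tokens deemed related to the new entity (hypothetical).
related_ids = [1, 3]

# Initialise the new token's embedding as the average of the related
# rows, then append it as a new row (the analogue of resizing the
# embedding table and writing the new vector into the last slot).
new_row = embedding_matrix[related_ids].mean(axis=0)
embedding_matrix = np.vstack([embedding_matrix, new_row[None, :]])
```

The new row is then fine-tuned along with the rest of the embedding table during training.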

yao8839836 commented 4 years ago

@MichaelHirn

I didn't try this.