DeepGraphLearning / KnowledgeGraphEmbedding


Init embedding random OR by bert encode from entity description text #50

Closed · fridayL closed this 3 years ago

fridayL commented 3 years ago

Hi, thanks a lot for your work on KGE. I'm still confused about embedding initialization: I tried to initialize the embeddings from BERT using the entity description text, but during training the loss on negative triplets keeps going up. I have changed the learning rate and other hyperparameters, but it didn't help. Will different embedding initializations lead to different model results?
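For reference, here is a minimal sketch of one plausible way to do this kind of BERT-based initialization (an assumption about the setup, not the actual code from the issue; it uses HuggingFace `transformers`, and the `encode_descriptions` helper, the `bert-base-uncased` checkpoint, and the random projection are all illustrative choices):

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased").eval()

@torch.no_grad()
def encode_descriptions(texts):
    """Mean-pool BERT token states over each entity description text."""
    batch = tokenizer(texts, padding=True, truncation=True,
                      return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state   # (B, T, 768)
    mask = batch["attention_mask"].unsqueeze(-1)  # zero out padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# One-time random projection from BERT's 768 dims down to the KGE
# dimension, copied into the model's entity embedding before training
# (`model.entity_embedding` is the parameter name in this repo):
# proj = torch.nn.Linear(768, entity_dim, bias=False)
# model.entity_embedding.data.copy_(proj(encode_descriptions(texts)))
```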

Edward-Sun commented 3 years ago

Yes. Currently, we have only tried randomly initialized embeddings, where the phases of the complex embedding are initialized uniformly in [-\pi, +\pi]. For your case, I guess initializing only the modulus from the BERT embedding and randomly initializing the phase would work better.
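A minimal sketch of that suggestion, assuming BERT vectors already projected to the model's hidden dimension (as in the sketch above) and the [real | imaginary] concatenation used by the complex-valued models in this repo; the function name and the rescaling are illustrative:

```python
import math
import torch

def bert_modulus_random_phase(bert_vecs):
    """Use BERT vectors for the modulus and a random phase in [-pi, +pi].

    bert_vecs: (num_entities, hidden_dim) tensor.
    Returns a (num_entities, 2 * hidden_dim) tensor holding the
    [real | imaginary] parts of the complex embedding.
    """
    modulus = bert_vecs.abs()           # moduli must be non-negative
    modulus = modulus / modulus.mean()  # crude rescaling; tune to match the random-init range
    # Phase drawn uniformly from [-pi, +pi], as in the random initialization.
    phase = (torch.rand_like(bert_vecs) * 2 - 1) * math.pi
    real = modulus * torch.cos(phase)
    imag = modulus * torch.sin(phase)
    return torch.cat([real, imag], dim=-1)
```

The idea is that the text signal survives in the magnitudes, while the phases, which carry the rotational structure the model trains, start from the same distribution as the random initialization.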