Closed: fridayL closed this issue 3 years ago
Hi, thanks a lot for your work on KGE, but I am still confused about embedding initialization. I tried initializing the embeddings from BERT using the entities' text information, but during training the loss on negative triplets still keeps increasing. I have changed the LR and other hyperparameters, but it did not help. Do different embedding initializations lead to different model results?

Yes. Currently we have only tried randomly initialized embeddings, and the phases of the complex embeddings are initialized exactly in [-\pi, +\pi]. For your case, I guess initializing only the modulus from the BERT embedding, while randomly initializing the phase, would work better.
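A minimal sketch of the suggested scheme (modulus from BERT-derived vectors, phase drawn uniformly from [-π, +π]). All names and shapes here are hypothetical; `bert_vectors` stands in for whatever entity-text encodings you extracted:

```python
import numpy as np

def init_complex_embedding(bert_vectors: np.ndarray) -> np.ndarray:
    """Build a complex entity embedding:
    - modulus: taken from the (hypothetical) BERT-derived vectors
    - phase:   drawn uniformly from [-pi, +pi]
    """
    num_entities, dim = bert_vectors.shape
    modulus = np.abs(bert_vectors)  # use BERT magnitudes as the modulus
    phase = np.random.uniform(-np.pi, np.pi, size=(num_entities, dim))
    # combine modulus and phase into complex-valued embeddings
    return modulus * np.cos(phase) + 1j * modulus * np.sin(phase)

# usage: pretend BERT vectors for 5 entities with dimension 8
emb = init_complex_embedding(np.random.randn(5, 8))
print(emb.shape)  # (5, 8)
```

Note the modulus of each resulting complex number equals the absolute value of the corresponding BERT entry, while the angle is random, so the pretrained signal survives only in the magnitudes.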