nju-websoft / muKG

μKG: A Library for Multi-source Knowledge Graph Embeddings and Applications, ISWC 2022
GNU General Public License v3.0

About the KGQA experiments in the ISWC paper #7

Closed AOZMH closed 1 year ago

AOZMH commented 1 year ago

Hi,

Thanks for this great project; it's very inspiring!

One thing I'm curious about is the detail of your multi-KG-embedding experiments for KGQA in the ISWC 22' paper, in which you align Freebase (used for QA) with Wikidata to enhance embedding quality. To me, embedding a single KB and predicting the alignment between two KBs are two separate tasks, so I wonder how you trained an embedding model that embeds both Freebase and Wikidata while also learning the alignments between them to improve the embeddings.

This question could probably be answered if you could kindly provide the detailed settings of the KGQA section (e.g., the embedding approach you used and the hyper-parameter settings). Hope to get your reply, and please let me know if anything is unclear. Thanks!

sunzequn commented 1 year ago

Hi,

Thanks for your interest in our work.

Given Freebase and Wikidata triples, as well as their entity alignment (which we can obtain from Linked Open Data), we merge the two KGs into a joint graph by replacing some Wikidata entities with their counterparts in Freebase. Then, we train ComplEx embeddings for the joint graph, and use the Freebase embeddings for multi-hop KGQA.
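For concreteness, a minimal sketch of that merging step, assuming triples are stored as (head, relation, tail) ID tuples and the alignment is a dict mapping Wikidata entity IDs to their Freebase counterparts (all names here are hypothetical, not the actual muKG API):

```python
# Hypothetical sketch of the merging step described above,
# not the actual muKG implementation.
# Triples are (head, relation, tail) ID tuples; wd_to_fb maps
# aligned Wikidata entity IDs to their Freebase counterparts.

def merge_kgs(freebase_triples, wikidata_triples, wd_to_fb):
    """Build the joint graph: rewrite aligned Wikidata entities to
    their Freebase IDs, then union the two triple sets."""
    remapped = {
        (wd_to_fb.get(h, h), r, wd_to_fb.get(t, t))
        for h, r, t in wikidata_triples
    }
    return set(freebase_triples) | remapped
```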

The hyper-parameter settings of ComplEx on this joint graph are the same as those on FB15K. 
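In case it helps, recall that ComplEx scores a triple as the real part of a trilinear product over complex-valued embeddings, score(h, r, t) = Re(⟨h, r, conj(t)⟩). A self-contained NumPy sketch of the scoring function (the dimension 200 below is only an illustrative choice, not necessarily the setting used in the paper):

```python
import numpy as np

def complex_score(h, r, t):
    """ComplEx scoring function (Trouillon et al., 2016):
    Re(<h, r, conj(t)>) over complex-valued embedding vectors."""
    return np.real(np.sum(h * r * np.conj(t)))

# Toy usage with random embeddings; d = 200 is illustrative only.
rng = np.random.default_rng(0)
d = 200
h, r, t = (rng.normal(size=d) + 1j * rng.normal(size=d) for _ in range(3))
print(complex_score(h, r, t))
```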

AOZMH commented 1 year ago

Cool, thanks for that!