LHRLAB / NQE

[AAAI 2023] Official resources of "NQE: N-ary Query Embedding for Complex Query Answering over Hyper-relational Knowledge Graphs".
https://doi.org/10.1609/aaai.v37i4.25576
MIT License

Cannot evaluate inp query #2

Open slnxyr opened 1 year ago

slnxyr commented 1 year ago

Hi, I followed your guide to train and evaluate the model with 1p-train, but something goes wrong when evaluating the inp query. Could you help me? Thanks. Besides, how long does it take to train on all queries? [screenshot of the error attached]

LHRLAB commented 1 year ago

That's strange. You may have a version problem with one of the packages.
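One way to rule out a package-version mismatch is to print the installed version of each dependency and compare it against the repo's pinned requirements. This is a generic sketch, not part of the NQE codebase; the package list below is a hypothetical example, so check it against the repo's actual requirements file.

```python
from importlib.metadata import version, PackageNotFoundError

def check_version(pkg):
    """Return the installed version string, or None if the package is missing."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# Hypothetical dependency list; replace with the names in requirements.txt.
for pkg in ["torch", "numpy", "tqdm"]:
    print(pkg, check_version(pkg) or "NOT INSTALLED")
```

If any line prints NOT INSTALLED or an unexpected version, reinstalling from the repo's requirements file is the first thing to try.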

slnxyr commented 1 year ago

But the other query types produce evaluation results with the same code.

LHRLAB commented 1 year ago

That may be a format problem with the inp dataset. We will check it; in the meantime, you can try re-downloading the dataset.

YiZuMonash commented 1 year ago

Same question as @slnxyr. I ran the code on the full set of queries, but it seems to take a very long time. How long should it take? It has been in epoch 1 for nearly a day, which confuses me a lot. My settings are as follows. Thanks.

parser.add_argument("--train_tasks", default="1p,2p,3p,2i,3i,pi,ip,2u,up,2cp,3cp,2in,3in,inp,pin,pni", type=str)

LHRLAB commented 1 year ago

Yes, full-query training does take a long time. However, because NQE is based on a Transformer, you can achieve a similar effect to full-query training with 1p training alone.
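Concretely, the advice above amounts to shrinking the `--train_tasks` flag from the full query list to just `1p`. A minimal sketch of that setting, mirroring the argparse line quoted earlier (only `--train_tasks` is confirmed by this thread; any other flags in the real script are unchanged):

```python
import argparse

parser = argparse.ArgumentParser()
# Train on 1p queries only; per the authors, the Transformer-based encoder
# generalizes to the other query types at evaluation time.
parser.add_argument("--train_tasks", default="1p", type=str)

args = parser.parse_args([])  # empty list: use defaults, for demonstration
print(args.train_tasks)  # → 1p
```

Evaluation can still be run over all query types; only the training task list is reduced.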