zjukg / KGTransformer

[Paper][WWW2023] Structure Pre-training and Prompt Tuning for Knowledge Graph Transfer
https://arxiv.org/pdf/2303.03922.pdf

Question: Where is the part of the code that lets pretrained parameters be used in downstream tasks? #3

Closed. WEIYanbin1999 closed this issue 1 year ago.

WEIYanbin1999 commented 1 year ago

I obtained the pretrained model; the checkpoints are stored under the BIG/ directory, like this: model_layer-4_hidden-768_heads-12_seq-126_textE-cls_t0-1.0_t1-1.0_t2-1.0.ep9

However, when I run the downstream-task code, it always reports: logger : INFO cannot load pretrained parameters.

I found the segment of code in the downstream-task Python script that attempts to load the pretrained parameters (see the attached screenshot).

When the code inside the try block runs, pretrain_save_path is empty. I tried setting it to model_layer-4_hidden-768_heads-12_seq-126_textE-cls_t0-1.0_t1-1.0_t2-1.0.ep6, ep7, ep8, and ep9, but then the parameter sizes mismatch.

So my question is: where exactly does the code load and use the parameters of the pretrained encoder?
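
For reference, since the screenshot may not render here, the loading block I am referring to presumably looks something like the sketch below. The names (pretrain_save_path, model, logger) and the strict=True call are my guesses based on the error message, not the repository's exact code:

```python
import os
import torch

# Hypothetical sketch of the try/except loading logic described above.
# pretrain_save_path, model, and logger are assumed names, not the repo's exact code.
def load_pretrained_encoder(model, pretrain_save_path, logger):
    try:
        if not pretrain_save_path or not os.path.exists(pretrain_save_path):
            raise FileNotFoundError(pretrain_save_path)
        state_dict = torch.load(pretrain_save_path, map_location="cpu")
        # strict=True raises a size-mismatch error if the checkpoint contains
        # tensors (e.g. word embeddings) whose shapes differ from the downstream
        # model -- which matches the symptom I am seeing.
        model.load_state_dict(state_dict, strict=True)
        logger.info("loaded pretrained parameters from %s", pretrain_save_path)
    except Exception:
        logger.info("cannot load pretrained parameters.")
```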

WEIYanbin1999 commented 1 year ago

The problem has been solved by pointing pretrain_save_path at the checkpoint with the '_delWE' suffix (see the attached screenshot). However, I don't understand why '_delWE' is appended to the pretrained-parameter path.

YushanZhu commented 1 year ago

Sorry for the late reply. Your way of dealing with it is okay. In fact, we also provide a file "get_pretrained_KGTransformer_parameters.py"; running it produces the model (with suffix "_delWE") that can be loaded. Hope it helps.
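
Conceptually, such a conversion works along the lines of the rough sketch below: load the raw pre-training checkpoint, drop the tensors that only exist for pre-training (the "_delWE" suffix suggests the word-embedding weights are deleted), and save a checkpoint whose shapes match the downstream model. This is only a sketch under those assumptions; the actual tensor key names and paths used in the script may differ:

```python
import torch

# Hedged sketch: strip word-embedding tensors from a pretrained checkpoint so the
# remaining parameters match the downstream model. The key-name pattern
# ("word_embedding") and the file path are assumptions for illustration only.
src = "BIG/model_layer-4_hidden-768_heads-12_seq-126_textE-cls_t0-1.0_t1-1.0_t2-1.0.ep9"
dst = src + "_delWE"

state_dict = torch.load(src, map_location="cpu")
filtered = {k: v for k, v in state_dict.items() if "word_embedding" not in k}

torch.save(filtered, dst)
print(f"saved {len(filtered)}/{len(state_dict)} tensors to {dst}")
```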