zjukg / KGTransformer

[Paper][WWW2023] Structure Pre-training and Prompt Tuning for Knowledge Graph Transfer
https://arxiv.org/pdf/2303.03922.pdf

What is the [M] in the prompt in Figure 3, the illustration for the zero-shot classification and QA tasks? #1

Closed WEIYanbin1999 closed 1 year ago

wencolani commented 1 year ago

It refers to the mask token, the same one used in subgraph pre-training.
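
For illustration, here is a minimal, self-contained sketch of the idea: the prompt sequence ends with a [M] token, and its contextualized output vector is scored against candidate labels, mirroring how masked positions are predicted during pre-training. The encoder, vocabulary size, token ids, and label embeddings below are all made up for the example; this is not the repository's actual code.

```python
import torch
import torch.nn as nn

VOCAB, DIM, MASK_ID = 100, 32, 3          # assumed vocabulary size and [M] id

# Toy stand-ins for the pre-trained encoder and the candidate label embeddings.
embed = nn.Embedding(VOCAB, DIM)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=DIM, nhead=4, batch_first=True),
    num_layers=2,
)
label_embeddings = torch.randn(5, DIM)    # 5 hypothetical candidate labels

prompt_ids = torch.tensor([[10, 57, 89, MASK_ID]])   # task tokens + trailing [M]
hidden = encoder(embed(prompt_ids))                  # (1, seq_len, DIM)
mask_vec = hidden[0, -1]                             # output at the [M] position
scores = label_embeddings @ mask_vec                 # score each candidate label
print("predicted label index:", scores.argmax().item())
```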

WEIYanbin1999 commented 1 year ago

Thanks for your answer. Also, I tried to run the pre-training code: "python run_pretrain.py --pretrain_dataset BIG --dataset_name BIG --num_hidden_layers 4 --train_bs 16 --lr 1e-4 --epochs 10", but the process freezes after "The two_hop_triple_path not exists, generate and dump it":

[screenshot] Could you help me?

YushanZhu commented 1 year ago

On the first run, it needs to generate and save the subgraph files; this process may take a while to complete.
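
For anyone hitting the same thing: the message describes a generate-and-cache pattern, so the apparent freeze is a one-off computation whose result is dumped to disk and loaded on later runs. A hypothetical sketch of that pattern is below; the cache file name and generation logic are assumptions for illustration, not the repository's code.

```python
import os
import pickle

CACHE = "two_hop_triple_path.pkl"   # assumed cache file name

def build_two_hop_paths(triples):
    """Toy stand-in for the expensive subgraph construction step."""
    out = {}
    for h, r, t in triples:
        out.setdefault(h, []).append((r, t))
    return out

if os.path.exists(CACHE):
    with open(CACHE, "rb") as f:
        paths = pickle.load(f)      # later runs load the cached file quickly
else:
    paths = build_two_hop_paths([(0, 1, 2), (2, 3, 4)])
    with open(CACHE, "wb") as f:
        pickle.dump(paths, f)       # slow only on the first run
```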