RichardHGL / WSDM2021_NSM

Improving Multi-hop Knowledge Base Question Answering by Learning Intermediate Supervision Signals. WSDM 2021.

The number of reasoning steps #2

Closed dinani65 closed 3 years ago

dinani65 commented 3 years ago

Could you please explain what K is in the paper? It is mentioned as the k-th reasoning step. How many reasoning steps are there? Is it a hyper-parameter? Does it depend on the number of hops of each question, the length of the query (the number of hidden states of the LSTM), or something else? Could it be 2 (forward reasoning and backward reasoning)? Also, I would greatly appreciate it if you could clarify, in a simple description, how the answer is selected. The student part uses the supervision signals to find the answer: it applies forward reasoning and tries to reduce the loss function, and then?

RichardHGL commented 3 years ago

Thanks for your attention!

The number of reasoning steps is a hyper-parameter, set with --num_step <number>. It controls how many times the instruction component and the reasoning component are repeated to conduct multi-hop reasoning. It should be set according to the dataset.
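The loop structure can be sketched as follows. This is a toy illustration of how `num_step` drives the repetition, not the repository's actual code: `make_instruction` and `reason_one_hop` are hypothetical stand-ins for the instruction and reasoning components.

```python
def make_instruction(question, step):
    """Stub for the instruction component (in the paper, this attends
    over the question encoding at each step)."""
    return f"{question}@step{step}"

def reason_one_hop(entity_dist, instruction):
    """Stub for the reasoning component (in the paper, this propagates
    probability mass one hop over the question-specific graph)."""
    return entity_dist + [instruction]

def run_nsm(question, num_step):
    """Repeat the instruction + reasoning components num_step times."""
    entity_dist = ["topic_entity"]  # reasoning starts from the topic entity
    for step in range(num_step):
        instruction = make_instruction(question, step)
        entity_dist = reason_one_hop(entity_dist, instruction)
    return entity_dist

trace = run_nsm("who directed X", num_step=3)
print(len(trace) - 1)  # number of hops performed
```

With `--num_step 3`, the pair of components runs three times, i.e. the model performs up to three hops of reasoning.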

The answer selection process is: (1) we obtain a probability distribution over all entities in the question-specific graph; (2) we rank these entities by probability; (3) from the top-ranked entities, we select answers until the accumulated probability exceeds a predefined limit (hyper-parameter) --eps <float between 0 and 1>. You can find this process in the function f1_and_hits_new of NSM/train/evaluate_nsm.py.
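The selection step above can be sketched in a few lines. This is a minimal illustration of the cumulative-probability cutoff, assuming a plain dict of entity probabilities; the names `entity_probs` and `select_answers` are illustrative, and the repository's actual implementation is in `f1_and_hits_new`.

```python
def select_answers(entity_probs, eps=0.95):
    """Rank entities by probability (descending) and keep top entities
    until the accumulated probability exceeds the threshold eps."""
    ranked = sorted(entity_probs.items(), key=lambda kv: kv[1], reverse=True)
    answers, cum = [], 0.0
    for entity, prob in ranked:
        answers.append(entity)
        cum += prob
        if cum > eps:  # stop once the cutoff --eps is exceeded
            break
    return answers

probs = {"e1": 0.60, "e2": 0.30, "e3": 0.08, "e4": 0.02}
print(select_answers(probs, eps=0.85))  # e1 + e2 = 0.90 > 0.85
```

A smaller `eps` yields fewer, higher-confidence answers; a larger `eps` admits more of the tail.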

dinani65 commented 3 years ago

Thanks for your explanation. So the goal of the teacher component is to generate supervision signals so that the maximum probability in the entity distribution corresponds to the answer entity (in forward reasoning), right?

RichardHGL commented 3 years ago

Yes