alibaba / GraphTranslator

GraphTranslator: Aligning Graph Model to Large Language Model for Open-ended Tasks
BSD 3-Clause "New" or "Revised" License

Several questions regarding the paper #11

Open JunyiChE opened 3 months ago

JunyiChE commented 3 months ago

Dear authors,

First of all, this is very inspiring and novel work for the Graph LLM community. However, I have several questions about the paper's details, especially the first stage of model training, which I hope you can clarify.

  1. What is the [DEC] token in this case? Does it represent the overall token sequence of the textual input, or is it a self-designed token placed at the start of the sequence to signal the decoding task?
  2. The third objective in the first-stage training uses a classifier to compute whether the query representation matches the textual representation. How do you define whether the two representations match? Does the classifier's output score actually reflect the ground truth of whether they match?
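To make my second question concrete, here is a minimal sketch of how I currently understand the matching objective: a binary head scores a (query, text) pair, and the "ground truth" labels come purely from the dataset pairing (an aligned pair is a positive; any other pairing in the batch is a negative, as in BLIP's image-text matching loss). All names and dimensions below are hypothetical, not from your code:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # embedding dimension, made up for illustration

# Hypothetical weights of the binary matching head (linear layer + sigmoid).
W_cls = rng.normal(size=(2 * D,))
b_cls = 0.0

def match_score(query_emb, text_emb):
    """Score one (query, text) pair: concatenate, project, squash to [0, 1]."""
    pair = np.concatenate([query_emb, text_emb])
    logit = pair @ W_cls + b_cls
    return 1.0 / (1.0 + np.exp(-logit))

# Labels are defined by the pairing itself, not by any semantic oracle:
# the query and its own node description form the positive pair, and a
# description sampled from another node in the batch forms the negative.
q = rng.normal(size=D)
t_pos = rng.normal(size=D)  # stand-in for the aligned description's embedding
t_neg = rng.normal(size=D)  # stand-in for an in-batch negative's embedding

p_pos = match_score(q, t_pos)  # trained toward label 1
p_neg = match_score(q, t_neg)  # trained toward label 0
```

If this understanding is right, then the classifier's score reflects "came from the same node" rather than any independent notion of semantic match, which is the part I would like confirmed.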

I also tried to find answers to the above questions in the original BLIP paper, but it does not provide a clear explanation either.

It would be greatly appreciated if you could help me understand these points. Thank you!