Closed: tonytan48 closed this issue 3 years ago.
Take a look at the ogb-lsc branch.
MultiheadAttention is not found in the 'ogb-lsc' branch. Could you point out which file it is in? Thank you.
Hi, thank you for your exciting work on Graphormer. I am curious to understand the mechanisms of this model, so I tried to declare the example encoder layer and commented out the data import lines.
It seems the MultiheadAttention module is not imported, and I am not sure whether the MHA module under Graphormer is customized or not. I am mostly curious about the implementation of the spatial encoding part.
Would it be possible for you to provide a toy example? Maybe a forward pass for a random 10x10 node matrix would do.
Please check the latest commit for example usage.
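For readers who want a self-contained illustration in the meantime, here is a minimal sketch of the spatial encoding idea on a random 10-node graph. It is a toy under stated assumptions, not the repository's code: the shortest-path distance matrix `spd` is filled with random stand-in values instead of being computed from a real graph, and it reuses `torch.nn.MultiheadAttention`'s additive float `attn_mask` in place of Graphormer's custom attention module. What it shows is the paper's spatial encoding: a learned scalar bias per (shortest-path distance, head) pair, added to the attention logits before the softmax.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
num_nodes, d_model, num_heads, max_dist = 10, 16, 4, 5

# Node features: (seq_len, batch, embed_dim), as nn.MultiheadAttention expects.
x = torch.randn(num_nodes, 1, d_model)

# Stand-in shortest-path distance matrix. In the real model this would be
# computed from the graph (e.g. Floyd-Warshall all-pairs shortest paths).
spd = torch.randint(0, max_dist, (num_nodes, num_nodes))

# Spatial encoding: one learnable scalar bias per (distance, head) pair.
spatial_bias = nn.Embedding(max_dist, num_heads)

# Look up a bias for every node pair -> (num_heads, N, N). PyTorch's
# MultiheadAttention accepts a float attn_mask of shape
# (batch * num_heads, L, L) that is added to the attention logits before
# the softmax, which is exactly where Graphormer injects b_{phi(i, j)}.
bias = spatial_bias(spd).permute(2, 0, 1)

mha = nn.MultiheadAttention(d_model, num_heads)
out, attn_weights = mha(x, x, x, attn_mask=bias)
print(out.shape)  # torch.Size([10, 1, 16])
```

The random `spd` plays the role of the "random 10x10 node matrix" asked about above; swapping in real shortest-path distances (and batching) is the main difference from a genuine forward pass.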