microsoft / Graphormer

Graphormer is a general-purpose deep learning backbone for molecular modeling.
MIT License

On the example usage of the Graphormer encoder #2

Closed: tonytan48 closed this issue 3 years ago

tonytan48 commented 3 years ago

Hi, thank you for your exciting work on Graphormer. I am curious to understand the mechanisms of this model, so I tried to instantiate the example encoder layer, with the data import lines commented out.

It seems that MultiheadAttention is not imported, and I am not sure whether the MHA module in Graphormer is customized or not. I am mostly curious about the implementation of the spatial encoding.

Would it be possible for you to provide a toy example? A forward pass for a random 10x10 node matrix would do.
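For concreteness, this is roughly the shape of example I am after, using torch's stock nn.MultiheadAttention purely as a stand-in for whatever module Graphormer actually uses (the embed_dim=32 and num_heads=4 values are arbitrary):

```python
import torch
import torch.nn as nn

# Stand-in only: torch's stock MultiheadAttention, not Graphormer's
# (possibly customized) module; just to show the kind of forward pass I mean.
mha = nn.MultiheadAttention(embed_dim=32, num_heads=4)
x = torch.randn(10, 1, 32)   # 10 nodes, batch of 1, 32-dim features
out, _ = mha(x, x, x)        # one self-attention forward pass
print(out.shape)             # torch.Size([10, 1, 32])
```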

sapphire008 commented 3 years ago

Take a look at the ogb-lsc branch.

yxw25 commented 3 years ago

> Take a look at the ogb-lsc branch.

MultiheadAttention is nowhere to be seen in the ogb-lsc branch. Can you point out which file it is in? Thank you.

zhengsx commented 3 years ago

> Would it be possible for you to provide a toy example? A forward pass for a random 10x10 node matrix would do.

Please check the latest commit for example usage.
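In the meantime, here is a minimal self-contained sketch of how the spatial encoding described in the paper can enter the attention computation: a learnable scalar bias per head, indexed by shortest-path distance and added to the attention logits before the softmax. All names here (ToyGraphormerLayer, spatial_bias, max_dist) are invented for illustration; this is a sketch of the idea, not the module shipped in this repo.

```python
import torch
import torch.nn as nn

class ToyGraphormerLayer(nn.Module):
    """One attention layer with a Graphormer-style spatial-encoding bias."""

    def __init__(self, dim=32, num_heads=4, max_dist=10):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)
        # Spatial encoding: one learnable scalar per (distance, head) pair,
        # added to the attention logits before the softmax.
        self.spatial_bias = nn.Embedding(max_dist + 1, num_heads)

    def forward(self, x, dist):
        # x:    (n, dim) node features
        # dist: (n, n)   integer shortest-path distances, clipped to max_dist
        n, dim = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(n, self.num_heads, self.head_dim).transpose(0, 1)  # (h, n, d)
        k = k.view(n, self.num_heads, self.head_dim).transpose(0, 1)
        v = v.view(n, self.num_heads, self.head_dim).transpose(0, 1)
        # Standard scaled dot-product scores ...
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5       # (h, n, n)
        # ... plus the per-head spatial bias b_phi(i, j) from the paper.
        scores = scores + self.spatial_bias(dist).permute(2, 0, 1)    # (h, n, n)
        attn = scores.softmax(dim=-1)
        out = (attn @ v).transpose(0, 1).reshape(n, dim)
        return self.out(out)

# The toy forward pass asked for above: a random 10-node graph.
torch.manual_seed(0)
x = torch.randn(10, 32)                 # random features for 10 nodes
dist = torch.randint(0, 5, (10, 10))    # fake shortest-path distance matrix
layer = ToyGraphormerLayer()
print(layer(x, dist).shape)             # torch.Size([10, 32])
```

The actual implementation differs in many details (batching, centrality and edge encodings, the virtual node token), but bias-on-logits is the core of the spatial encoding, which is why plain MultiheadAttention alone will not reproduce it.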