lucidrains / En-transformer

Implementation of E(n)-Transformer, which incorporates attention mechanisms into Welling's E(n)-Equivariant Graph Neural Network
MIT License

question about the data format #14

Open llllly26 opened 7 months ago

llllly26 commented 7 months ago

Hi @lucidrains,

I found that the input format of En-Transformer differs from the EGNN in your released repository: EGNN-pytorch uses the torch_geometric conventions for nodes and edges, encoding nodes as [bs_node_number_sum, dim] and edges as [2, bs_edge_number_sum]. How can I feed data in this format into En-Transformer?
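For reference, here is a minimal sketch (plain PyTorch, no torch_geometric dependency) of how a PyG-style flat batch might be converted into the dense, zero-padded `[batch, nodes, dim]` tensors plus a boolean `mask`, which is the shape convention the En-Transformer README examples use. All tensor names and sizes below are made up for illustration, and it assumes nodes are sorted by graph id as PyG's `Batch` guarantees:

```python
import torch

# Hypothetical PyG-style flat batch: 2 graphs with 3 and 2 nodes
x = torch.randn(5, 16)                      # node features [total_nodes, dim]
coords = torch.randn(5, 3)                  # coordinates   [total_nodes, 3]
batch = torch.tensor([0, 0, 0, 1, 1])       # graph id per node
edge_index = torch.tensor([[0, 1, 3],
                           [1, 2, 4]])      # edges [2, total_edges]

num_graphs = int(batch.max()) + 1
counts = torch.bincount(batch)              # nodes per graph
n_max = int(counts.max())

# dense, zero-padded tensors plus a boolean padding mask
feats = x.new_zeros(num_graphs, n_max, x.size(1))
coors = coords.new_zeros(num_graphs, n_max, 3)
mask = torch.zeros(num_graphs, n_max, dtype=torch.bool)

# position of each node within its own graph
ptr = torch.cat([torch.zeros(1, dtype=torch.long), counts.cumsum(0)])
local_idx = torch.arange(batch.size(0)) - ptr[batch]

feats[batch, local_idx] = x
coors[batch, local_idx] = coords
mask[batch, local_idx] = True

# dense per-graph adjacency, built from the sparse edge_index
adj_mat = torch.zeros(num_graphs, n_max, n_max, dtype=torch.bool)
src, dst = edge_index
adj_mat[batch[src], local_idx[src], local_idx[dst]] = True
```

If I read the README correctly, `feats`, `coors`, and `mask` can then go straight into the model, e.g. `model(feats, coors, mask=mask)`; torch_geometric's own `to_dense_batch` / `to_dense_adj` utilities do essentially the same conversion.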

thanks!