vsomnath / graphretro

Learning Graph Models for Retrosynthesis Prediction (NeurIPS 2021)
https://arxiv.org/abs/2006.07038
MIT License

missing functions in atom attention layer #7

Closed ANugmanova closed 1 year ago

ANugmanova commented 1 year ago

Good afternoon! I would like to try using the attention layer, but I noticed that the implementations of several functions in the `AtomAttention` module are missing, such as `create_scope_tensor`, `flat_to_batch`, and `get_pair`. Could you please add them, or say a little more about what they should do?

vsomnath commented 1 year ago

Hi, thank you for the question!

I think I deleted them while cleaning up, since those functions were never used outside this module and the final model didn't use any attention. I will check whether they are still in the private version!
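For anyone blocked on this in the meantime: the following is only a hypothetical sketch of what helpers with these names commonly do in message-passing code, not the original implementation (which was deleted, per above). It assumes a `scope` given as a list of `(start, length)` pairs indexing each molecule's atoms within a flat feature tensor; the signatures and behavior are guesses.

```python
import torch

def create_scope_tensor(scope, device=None):
    # Hypothetical: expand (start, length) pairs into per-molecule
    # index tensors over the flat atom dimension.
    return [torch.arange(st, st + le, device=device) for st, le in scope]

def flat_to_batch(flat, scope):
    # Hypothetical: pad a flat [num_atoms, hidden] tensor into a
    # batched [batch_size, max_atoms, hidden] tensor plus a boolean
    # mask marking real (non-padding) atoms.
    max_len = max(le for _, le in scope)
    batch = flat.new_zeros(len(scope), max_len, flat.size(1))
    mask = torch.zeros(len(scope), max_len, dtype=torch.bool)
    for i, (st, le) in enumerate(scope):
        batch[i, :le] = flat[st:st + le]
        mask[i, :le] = True
    return batch, mask

def get_pair(batch):
    # Hypothetical: build pairwise features for attention scoring by
    # concatenating every pair of atom vectors:
    # [B, N, h] -> [B, N, N, 2h].
    B, N, h = batch.size()
    left = batch.unsqueeze(2).expand(B, N, N, h)
    right = batch.unsqueeze(1).expand(B, N, N, h)
    return torch.cat([left, right], dim=-1)
```

With this reading, an attention layer would compute scores from `get_pair(...)`, apply the mask from `flat_to_batch` before the softmax, and unpad back to the flat layout afterwards. Again, this is only a plausible reconstruction under the stated assumptions.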

ANugmanova commented 1 year ago

Thanks a lot!

vsomnath commented 1 year ago

Hi, sorry, I could not find them in the private version either. The references may simply have been carried over from earlier code.