Closed ANugmanova closed 1 year ago

Good afternoon! I would like to try using the attention layer, but I noticed that the implementations of several functions in the AtomAttention module are missing, such as create_scope_tensor, flat_to_batch, and get_pair. Could you please add them, or explain a bit more about what they should do?

Hi, thank you for the question! I think I deleted them while cleaning up, since the functions were largely never used outside of this module and the final model didn't use any attention. I will check whether they are still in the private version!

Thanks a lot!

Hi, sorry, I could not find them in the private version either. Maybe this was just carried over from earlier code.
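Since the original implementations appear to be lost, here is a minimal sketch of what helpers with these names plausibly did in a windowed atom-attention setting. Everything below is an assumption inferred from the names alone (shapes, padding behaviour, and the pairwise-difference semantics of `get_pair` are all guesses, not the original code), written in plain NumPy for clarity:

```python
import numpy as np

# HYPOTHETICAL reconstructions -- the real helpers were deleted, so every
# shape and behaviour below is an assumption, not the original code.

def flat_to_batch(x, window):
    """Reshape a flat per-atom tensor (n_atoms, d) into non-overlapping
    windows (n_windows, window, d), zero-padding the last window."""
    n, d = x.shape
    n_windows = -(-n // window)  # ceiling division
    padded = np.zeros((n_windows * window, d), dtype=x.dtype)
    padded[:n] = x
    return padded.reshape(n_windows, window, d)

def create_scope_tensor(n_atoms, window):
    """Boolean mask of shape (n_windows, window): True where a slot holds
    a real atom rather than padding (the attention "scope")."""
    n_windows = -(-n_atoms // window)
    idx = np.arange(n_windows * window)
    return (idx < n_atoms).reshape(n_windows, window)

def get_pair(x):
    """Pairwise difference features within each window:
    (n_windows, window, d) -> (n_windows, window, window, d)."""
    return x[:, :, None, :] - x[:, None, :, :]
```

Under these assumptions, `flat_to_batch` and `create_scope_tensor` would be called together so that attention over padded slots can be masked out, and `get_pair` would feed a pair-bias term; the original module may well have differed in any of these choices.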