Open fransilvionGenomica opened 1 year ago
Are you interested in trying relative encodings for the implicit long convolution filter (HyenaFilter), or a more traditional implementation of encodings that would work at the HyenaOperator level? In our experience, the latter does not appear to affect performance much, since Hyena is not permutation equivariant.
We received requests for a version with KERPLE positional embeddings, so that might be something to consider.
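For context, the logarithmic variant of KERPLE defines a relative-position bias b(i, j) = -r1 * log(1 + r2 * |i - j|) with learned positive parameters r1 and r2. A minimal NumPy sketch of that bias matrix (the function name and default parameters here are illustrative, not from any repo):

```python
import numpy as np

def kerple_log_bias(seq_len: int, r1: float = 1.0, r2: float = 1.0) -> np.ndarray:
    """KERPLE logarithmic bias: b(i, j) = -r1 * log(1 + r2 * |i - j|).

    Returns a (seq_len, seq_len) matrix; r1 and r2 would be learned
    positive scalars in practice (one pair per head).
    """
    idx = np.arange(seq_len)
    dist = np.abs(idx[:, None] - idx[None, :])  # relative distance |i - j|
    return -r1 * np.log1p(r2 * dist)

bias = kerple_log_bias(4)
# The diagonal (distance 0) is zero, and the bias decays with distance.
```

The bias depends only on |i - j|, which is what makes it a relative (and length-extrapolating) scheme.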
Hi,
I was thinking more of the former (relative encodings for the implicit long conv filter). Have you tried that?
Hello,
Is there a way to implement relative positional encodings with Hyena similar to what was done in the Transformer-XL paper? Any tips on how to implement that?
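One observation that may help: because the implicit filter is applied via a causal convolution, it only ever sees the offset t - s between output and input positions, so a filter parameterized by an MLP over (sinusoidal) position features is already relative in the Transformer-XL sense. A toy NumPy sketch under that assumption (W1, W2, and the feature dimension are made-up illustrative choices, not the Hyena repo's actual parameterization):

```python
import numpy as np

rng = np.random.default_rng(0)

def sinusoidal_features(positions: np.ndarray, dim: int = 8) -> np.ndarray:
    """Transformer-XL-style sinusoidal features of (relative) positions."""
    freqs = 1.0 / (10000 ** (np.arange(dim // 2) / (dim // 2)))
    ang = positions[:, None] * freqs[None, :]
    return np.concatenate([np.sin(ang), np.cos(ang)], axis=-1)

# Toy "implicit filter": a tiny random MLP mapping position features
# to filter taps. W1/W2 stand in for learned weights.
L, dim, hidden = 16, 8, 32
W1 = rng.normal(size=(dim, hidden)) / np.sqrt(dim)
W2 = rng.normal(size=(hidden, 1)) / np.sqrt(hidden)

feats = sinusoidal_features(np.arange(L), dim)  # offsets 0 .. L-1
h = (np.tanh(feats @ W1) @ W2)[:, 0]            # (L,) filter taps

# Causal convolution y[t] = sum_{s <= t} h[t - s] * u[s]: the filter is
# indexed only by the relative offset t - s, never by absolute position.
u = rng.normal(size=L)
y = np.array([np.sum(h[: t + 1][::-1] * u[: t + 1]) for t in range(L)])
```

So a Transformer-XL-flavored variant would amount to choosing sinusoidal features of the offset as the MLP's input; whether that beats the existing filter parameterization is an empirical question.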