Open b-fg opened 1 year ago
That would be super helpful for everyone working with physical systems :).
A small note:
1. Using the "convolution is all you need" approach, all that is needed is the ability to perform lifting + group convolutions + projection; I am not familiar with this package, but I guess this should be straightforward / possibly already available? (See the sketch after this list.)
2. If there is a need to work with a truly continuous group of transformations, then some approach with steerable kernels may be needed. This is likely more work, but in the first place approach 1 may be enough; approach 2 is icing on the cake.
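To make point 1 concrete, here is a minimal, hypothetical sketch of the lifting step for the discrete C4 group (0°/90°/180°/270° rotations), built only from standard Flux/NNlib pieces. The layer name `LiftingConvC4`, the `rot90_kernel` helper, and all sizes are illustrative assumptions, not existing Flux API:

```julia
using Flux, NNlib

# Hypothetical sketch (not part of Flux): a lifting convolution for the C4
# group. One base kernel is rotated to all four orientations, each rotated
# copy is applied with an ordinary planar convolution, and the results are
# stacked along a new orientation axis.
struct LiftingConvC4{W}
    weight::W                          # (k, k, c_in, c_out), WHCN layout
end
Flux.@functor LiftingConvC4

LiftingConvC4(k::Int, ch::Pair{Int,Int}) =
    LiftingConvC4(Flux.glorot_uniform(k, k, first(ch), last(ch)))

# Rotate each (k, k) spatial slice of the kernel by r * 90° counter-clockwise.
rot90_kernel(w) = reverse(permutedims(w, (2, 1, 3, 4)); dims = 1)
rot90_kernel(w, r) = foldl((a, _) -> rot90_kernel(a), 1:r; init = w)

function (l::LiftingConvC4)(x)         # x :: (W, H, c_in, batch)
    ys = map(0:3) do r
        y = NNlib.conv(x, rot90_kernel(l.weight, r); pad = size(l.weight, 1) ÷ 2)
        reshape(y, size(y, 1), size(y, 2), size(y, 3), 1, size(y, 4))
    end
    cat(ys...; dims = 4)               # (W, H, c_out, 4, batch)
end
```

Group convolutions on the lifted feature maps and the final projection (e.g. a max over the orientation axis) would follow the same weight-sharing pattern; for truly continuous groups, as in point 2, steerable kernels are the usual route.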
All this seems like a great feature set to prototype in a dedicated library! The general criteria we have for inclusion in core Flux are maturity (of both methods and code), ubiquity, and cross-domain applicability. Flux also needs to be very careful about factors such as backwards compatibility and architectural fit, so a separate package should offer both more agility and room to experiment with design.
Group convolution is one of the members of the geometric deep learning family, so it should be supported by a geometric deep learning library based on Flux.jl. GeometricFlux.jl is a good place to support group convolutions, and this model is listed at FluxML/GeometricFlux.jl#225. PRs are welcome.
Thanks for pointing it out @yuehhua, that's indeed what we are after.
Motivation and description
Group equivariant CNNs (G-CNNs) embed rotation and/or scale equivariance on top of the translation equivariance of standard CNNs (a minimal usage sketch follows below). Some references:
Is there support for this type of architecture in Flux? In PyTorch, an implementation on top of the main library is exemplified here, although I am not sure whether it is directly implemented there nowadays.
Thanks!
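For illustration only, assuming the hypothetical `LiftingConvC4` sketch from the comment above, a small G-CNN-style pipeline can be assembled from standard Flux layers; max-pooling over the orientation axis projects the lifted features to a representation that is invariant to 90° rotations, on top of the usual translation invariance:

```julia
using Flux

# Project back from the group: max over the four orientations.
group_pool(y) = dropdims(maximum(y; dims = 4); dims = 4)   # (W,H,C,4,N) -> (W,H,C,N)

model = Chain(
    LiftingConvC4(3, 1 => 8),          # hypothetical lifting layer defined above
    x -> relu.(x),
    group_pool,                        # orientation pooling
    GlobalMeanPool(),                  # (W,H,8,N) -> (1,1,8,N)
    Flux.flatten,
    Dense(8 => 10),
)

x = rand(Float32, 28, 28, 1, 4)        # dummy batch of 28×28 grayscale images
size(model(x))                         # (10, 4)
```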
Possible Implementation
No response