loganbruns opened 5 years ago
Feed-forward convolutional architecture with self-attention: adds self-attention, as described in "Self-Attention Generative Adversarial Networks" by Zhang, Goodfellow, Metaxas, and Odena, to the tower+pool architecture.
As discussed in issues #28 and #27.
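For reference, here is a minimal sketch of the SAGAN-style self-attention block this change adds on top of the convolutional tower. It is written in PyTorch purely for illustration; the framework, class name, and the `reduction` parameter are assumptions and may not match the code in this branch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention2d(nn.Module):
    """SAGAN-style self-attention over a 2-D feature map (Zhang et al.)."""

    def __init__(self, in_channels, reduction=8):
        super().__init__()
        # 1x1 convolutions produce the query, key, and value projections.
        self.query = nn.Conv2d(in_channels, in_channels // reduction, kernel_size=1)
        self.key = nn.Conv2d(in_channels, in_channels // reduction, kernel_size=1)
        self.value = nn.Conv2d(in_channels, in_channels, kernel_size=1)
        # Learnable scale initialized to zero, so the block starts as an
        # identity and the network can ease into using attention.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.shape
        n = h * w
        q = self.query(x).view(b, -1, n)                 # (b, c', n)
        k = self.key(x).view(b, -1, n)                   # (b, c', n)
        v = self.value(x).view(b, c, n)                  # (b, c, n)
        # Attention weights across all spatial positions.
        attn = F.softmax(torch.bmm(q.transpose(1, 2), k), dim=-1)   # (b, n, n)
        out = torch.bmm(v, attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x
```

In this sketch the block would be inserted between convolutional stages of the tower, before pooling, so each spatial position can attend to every other position in the feature map.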