-
Thank you for your nice work!
I would like to ask a question about the One-Way attention in the paper. Why is the One-Way attention mechanism not used in the Cross-Attention (Inter-Frame) but only in in…
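For context, the generic one-directional (causal-mask) attention idiom looks like the sketch below; this is the standard masking pattern and may or may not match the paper's exact One-Way design.

```python
import torch
import torch.nn.functional as F

def one_way_attention(q, k, v):                 # q, k, v: (B, T, D)
    T = q.size(1)
    scores = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5
    # Mask out future positions: token t may attend only to tokens <= t.
    mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
    scores = scores.masked_fill(mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

x = torch.randn(2, 4, 8)
print(one_way_attention(x, x, x).shape)         # torch.Size([2, 4, 8])
```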
-
I'm trying to extend a project which implements a GNN using Battaglia et al.'s definition through the `MetaLayer` class.
I would like to include some attention mechanisms as defined [here](https://g…
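For concreteness, here is a hedged sketch of one way a GAT-style attention could slot into `MetaLayer`'s node model; the mechanism behind the truncated link may differ, and `AttnNodeModel` and its internals are illustrative names, not an existing class.

```python
import torch
import torch.nn as nn
from torch_geometric.nn import MetaLayer
from torch_geometric.utils import softmax

class AttnNodeModel(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.att = nn.Linear(2 * dim, 1)    # scores each (target, source) pair
        self.msg = nn.Linear(dim, dim)

    def forward(self, x, edge_index, edge_attr, u, batch):
        src, dst = edge_index               # messages flow src -> dst
        logits = self.att(torch.cat([x[dst], x[src]], dim=-1)).squeeze(-1)
        alpha = softmax(logits, dst)        # normalize over each node's in-edges
        out = torch.zeros_like(x)
        out.index_add_(0, dst, alpha.unsqueeze(-1) * self.msg(x[src]))
        return out

dim = 8
gn = MetaLayer(node_model=AttnNodeModel(dim))
x = torch.randn(5, dim)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])
x_out, _, _ = gn(x, edge_index)
print(x_out.shape)                          # torch.Size([5, 8])
```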
-
A generic PyTorch transformer layer
## 🚀 Feature
input:
a 2-dim Tensor [x_{00}, x_{01}, ... x_{0m_0}, x_{10}, ... x_{1m_1}, ... x_{km_k}]; each x is a vector.
and either (or both, but the…
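A minimal sketch of how such a layer could consume the packed 2-dim Tensor, assuming it is also given the per-sequence lengths; `packed_transformer_forward` and `lengths` are illustrative names, not an existing API.

```python
import torch
import torch.nn as nn

def packed_transformer_forward(packed, lengths, layer):
    """Run a TransformerEncoderLayer over a packed 2-dim tensor.

    packed:  (total_tokens, d_model), concatenation of k+1 sequences
    lengths: per-sequence lengths [m_0+1, m_1+1, ..., m_k+1]
    layer:   nn.TransformerEncoderLayer(batch_first=True)
    """
    seqs = torch.split(packed, lengths)                  # unpack by length
    padded = nn.utils.rnn.pad_sequence(seqs, batch_first=True)
    # True marks padding positions that attention must ignore.
    max_len = padded.size(1)
    pad_mask = torch.arange(max_len)[None, :] >= torch.tensor(lengths)[:, None]
    out = layer(padded, src_key_padding_mask=pad_mask)
    # Repack to the original 2-dim layout.
    return torch.cat([out[i, :n] for i, n in enumerate(lengths)])

layer = nn.TransformerEncoderLayer(d_model=16, nhead=4, batch_first=True)
packed = torch.randn(7, 16)      # e.g. two sequences of lengths 3 and 4
print(packed_transformer_forward(packed, [3, 4], layer).shape)  # (7, 16)
```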
-
**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows 10
- TensorFlow version and how it was installed (source or binary): 2.2.0, binary
- TensorFlow-Addons ve…
-
Hello, thanks for your nice work. I have encountered an issue during my training process: I often see small grid-like noise artifacts in the generated images. I suspect this might be caused by the att…
-
### Feature Request
Is there a way to define a custom unary operator `getXAt` that takes an integer `i` as a parameter and returns `X_i`? This could possibly allow creating a similar mechanism to a…
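As a library-agnostic illustration only (this request doesn't show the actual operator API), the idea in plain Python is a factory that binds `i` and returns a unary callable:

```python
# Hypothetical sketch: get_x_at(i) builds a unary operator that selects X_i.
# All names here are illustrative, not part of the library being discussed.
from typing import Callable, Sequence, TypeVar

T = TypeVar("T")

def get_x_at(i: int) -> Callable[[Sequence[T]], T]:
    def op(X: Sequence[T]) -> T:
        return X[i]
    return op

select_second = get_x_at(1)
print(select_second(["a", "b", "c"]))  # "b"
```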
-
Great work! I have several questions about your work.
The Adaptive Channel Fusion module seems to be an attention mechanism, right? And I'm wondering how to visualize the attention map in order…
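One hedged way to inspect such a map is a forward hook. In the self-contained sketch below, the toy `SEBlock` merely stands in for the Adaptive Channel Fusion module, whose internals aren't shown in this issue:

```python
import torch
import torch.nn as nn
import matplotlib.pyplot as plt

class SEBlock(nn.Module):
    """Toy squeeze-and-excitation style channel attention (stand-in module)."""
    def __init__(self, channels):
        super().__init__()
        self.fc = nn.Sequential(nn.Linear(channels, channels), nn.Sigmoid())

    def forward(self, x):                        # x: (B, C, H, W)
        w = self.fc(x.mean(dim=(2, 3)))          # (B, C) channel weights
        self.attn_weights = w                    # stash for the hook to read
        return x * w[:, :, None, None]

captured = {}
block = SEBlock(channels=8)
handle = block.register_forward_hook(
    lambda m, i, o: captured.update(attn=m.attn_weights.detach()))

with torch.no_grad():
    block(torch.randn(1, 8, 4, 4))
handle.remove()

plt.bar(range(8), captured["attn"][0].numpy())   # one bar per channel
plt.xlabel("channel"); plt.ylabel("attention weight")
plt.show()
```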
-
### Description
When we add `TestRunFeatureSuiteCalico`, we can see that `TestNSE_Composition` doesn't work properly.
The first Request (from the init-container) is usually fine, but the following Re…
-
### Describe the problem
I know that you have a lot to do without me! Guys, I think it's very important to have this functionality: directly passing values from the layout to pages, to components an…
-
**Is your feature request related to a problem? Please describe.**
Caching the key and value matrices during self-attention to reduce computational complexity and improve inference sp…
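A minimal single-head sketch of the requested KV caching; the cache layout and all names are illustrative, not this project's API:

```python
import torch
import torch.nn.functional as F

class CachedSelfAttention(torch.nn.Module):
    def __init__(self, d_model):
        super().__init__()
        self.qkv = torch.nn.Linear(d_model, 3 * d_model)
        self.cache_k = None   # (B, T_past, D)
        self.cache_v = None

    def forward(self, x):                 # x: (B, T_new, D); T_new=1 at decode
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        if self.cache_k is not None:
            # Reuse past keys/values instead of recomputing them each step.
            k = torch.cat([self.cache_k, k], dim=1)
            v = torch.cat([self.cache_v, v], dim=1)
        self.cache_k, self.cache_v = k, v
        attn = F.softmax(q @ k.transpose(-2, -1) / k.size(-1) ** 0.5, dim=-1)
        return attn @ v

layer = CachedSelfAttention(16)
prompt = torch.randn(1, 5, 16)
_ = layer(prompt)                        # prefill: cache now holds 5 positions
step = layer(torch.randn(1, 1, 16))      # decode one token against the cache
print(step.shape)                        # torch.Size([1, 1, 16])
```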