-
Is it possible to recover the attention scores from the Fast Attention module?
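For context, this is the kind of thing I'm hoping is possible. A minimal sketch, assuming the module can expose the random-feature maps for queries and keys (the names here are mine, not the actual API):

```python
import torch

# Hypothetical sketch, not the actual API: assume q_prime and k_prime are
# the random-feature maps phi(Q) and phi(K) used inside Fast Attention,
# shaped (batch, heads, seq_len, nb_features).
def approx_attention_scores(q_prime, k_prime):
    scores = torch.einsum("bhif,bhjf->bhij", q_prime, k_prime)
    # Row-normalize so each query's scores sum to 1, mirroring softmax.
    return scores / scores.sum(dim=-1, keepdim=True)
```

I realize this materializes the full seq × seq matrix and defeats the linear-memory benefit, so it would only be for inspection and debugging.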
-
Hello
I am trying to train Flowtron on LJSpeech.
Unfortunately, after 24 hours of training, the attention weights are still bad.
Server configuration: 4 instances with 8xV100.
![image](https://user-im…
-
Nice work, and highly configurable.
Are there plans to extend the implementation with additional attention mechanisms?
-
An interesting thought on HTM: the SP's active columns get `boosted` by `anomalous` input, thereby creating a "short-term attention memory".
@subutai @cogmission
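For reference, my (possibly wrong) understanding of the standard SP boosting rule, roughly as in NuPIC; the parameter names here are mine:

```python
import numpy as np

def update_boost_factors(active_duty_cycles, target_density, boost_strength):
    # Columns that have been less active than the target density get an
    # exponentially larger boost factor, making them likelier to win in
    # the inhibition step (and vice versa for over-active columns).
    return np.exp(boost_strength * (target_density - active_duty_cycles))
```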
-
Hi Jianshu
Do you have code for attention visualization?
Thank you
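If it helps, something like this minimal heatmap sketch is what I have in mind (the data layout is my assumption):

```python
import matplotlib.pyplot as plt

def plot_attention(attn, in_tokens, out_tokens):
    # attn: (len(out_tokens), len(in_tokens)) matrix of attention weights.
    fig, ax = plt.subplots()
    im = ax.imshow(attn, cmap="viridis")
    ax.set_xticks(range(len(in_tokens)))
    ax.set_xticklabels(in_tokens, rotation=90)
    ax.set_yticks(range(len(out_tokens)))
    ax.set_yticklabels(out_tokens)
    fig.colorbar(im, ax=ax)
    plt.show()
```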
-
It would be nice to have a feature for actively drawing attention to oneself, such as making a noise. For example, it could be used in multiplayer as a distraction for other players.
-
Bro, could you replace the SE (Squeeze-and-Excitation) block with a CA (Coordinate Attention) block in EfficientNet? This method can usually improve network performance. See the sketch below for what I mean. Thank you.
The website of…
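For reference, here's a rough sketch of a CA block as I understand it from the Coordinate Attention paper (Hou et al., 2021). It's my own simplified approximation (ReLU instead of h-swish), not the authors' code:

```python
import torch
import torch.nn as nn

class CoordAttention(nn.Module):
    # Sketch of Coordinate Attention: factorize global pooling into two
    # 1D poolings along H and W, then re-weight the feature map.
    def __init__(self, channels, reduction=32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.size()
        # Pool along width -> (b, c, h, 1); pool along height -> (b, c, w, 1).
        x_h = x.mean(dim=3, keepdim=True)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)
        y = torch.cat([x_h, x_w], dim=2)           # (b, c, h + w, 1)
        y = self.act(self.bn1(self.conv1(y)))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (b, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (b, c, 1, w)
        return x * a_h * a_w
```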
-
Hello!
I am doing a translation task and would like to try using flash attention in my model.
In addition to the usual triangular mask, I also need to mask padding tokens so that the model does not p…
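For context, this is what I currently do with PyTorch's built-in scaled_dot_product_attention. My understanding is that passing an explicit attn_mask like this may prevent the flash kernel from being selected, which is why I'm asking:

```python
import torch
import torch.nn.functional as F

def masked_attention(q, k, v, key_padding_mask):
    # q, k, v: (batch, heads, seq, head_dim)
    # key_padding_mask: (batch, seq) boolean, True for real tokens.
    seq = q.size(2)
    causal = torch.tril(torch.ones(seq, seq, dtype=torch.bool, device=q.device))
    # Position i may attend to j only if j <= i AND token j is not padding.
    mask = causal[None, None] & key_padding_mask[:, None, None, :]
    return F.scaled_dot_product_attention(q, k, v, attn_mask=mask)
```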
-
Hey,
I'm looking for a function that visualizes the attention of every ViT layer (like Fig. 4 in the article). Do you provide one?
Thanks
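To illustrate, this is roughly what I mean; it assumes each layer's softmax attention weights can be collected somehow, with the CLS token at index 0 (my assumption about the layout):

```python
import matplotlib.pyplot as plt

def show_cls_attention(attn_maps, grid_size):
    # attn_maps: list over layers of (heads, tokens, tokens) arrays, with
    # the CLS token at index 0 and patch tokens in row-major order.
    fig, axes = plt.subplots(1, len(attn_maps), figsize=(3 * len(attn_maps), 3))
    for layer, (ax, attn) in enumerate(zip(axes, attn_maps)):
        cls_to_patches = attn.mean(0)[0, 1:]  # average heads, take CLS row
        ax.imshow(cls_to_patches.reshape(grid_size, grid_size), cmap="viridis")
        ax.set_title(f"layer {layer}")
        ax.axis("off")
    plt.show()
```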
-
Hello. After installing the dependencies from requirement.txt and setup.py, the following problem occurred when running train. Could you please take a look? Thank you.
Traceback (most recent call last):
File "F:\MTR-mas…