Closed richardbaihe closed 3 years ago
This paper proposes visualizing attribution scores instead of attention weights to analyze token importance in transformer-based NLP models. The attribution scores are defined as below:
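The original definition does not render here. As a hedged sketch, a common form of attention attribution scales each attention entry by its integrated gradient, `Attr(A) = A ⊙ ∫₀¹ ∂F(αA)/∂A dα`; the function names and the toy model below are illustrative assumptions, not the paper's code:

```python
import numpy as np

def attention_attribution(f, A, steps=20, eps=1e-5):
    """Riemann-sum approximation of A * integral_0^1 dF(alpha*A)/dA dalpha,
    using finite-difference gradients (sketch, not the paper's implementation)."""
    total_grad = np.zeros_like(A)
    for alpha in np.linspace(1.0 / steps, 1.0, steps):
        base = alpha * A
        grad = np.zeros_like(A)
        # finite-difference gradient of f at the scaled attention matrix
        for idx in np.ndindex(A.shape):
            pert = base.copy()
            pert[idx] += eps
            grad[idx] = (f(pert) - f(base)) / eps
        total_grad += grad
    # element-wise product of attention weights and averaged gradients
    return A * total_grad / steps

# Toy "model output": a fixed linear readout of the attention entries
w = np.array([[1.0, 2.0], [0.0, 3.0]])
f = lambda A: float((w * A).sum())
A = np.array([[0.5, 0.5], [0.25, 0.75]])
attr = attention_attribution(f, A)
```

For a linear readout like this toy `f`, the integrated gradient reduces to the weight itself, so `attr` equals `A * w` element-wise; for a real transformer, `f` would be the model's output logit as a function of one head's attention matrix.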
Results: