UMass-Foundation-Model / FlexAttention

Apache License 2.0

[Question] Where is the code related to the FlexAttention and high-resolution feature selection module proposed in the paper? #2

Closed: lartpang closed this issue 1 month ago

lartpang commented 2 months ago

Question

@senfu

Great work! Where is the code for the FlexAttention and high-resolution feature selection modules proposed in the paper? I can't find any code in the repository that corresponds to these names.

junyan-li-tri commented 1 month ago

Hi @lartpang, thanks for your interest. FlexAttention and the high-resolution feature selection module are implemented in transformers/src/transformers/models/llama/modeling_llama.py, within the self-attention implementation. For example, this line.
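
For anyone else digging through the file, here is a minimal sketch of the general idea as I understand it from the paper: attention weights over the low-resolution image tokens are used to pick out the most relevant high-resolution feature tokens, which then join the next attention layer. The function name `select_high_res_features`, the `top_k` parameter, and the tensor layouts below are my own illustrative assumptions, not the authors' code; the actual logic lives inside the modified Llama attention module linked above.

```python
import torch

def select_high_res_features(attn_weights, high_res_feats, top_k=64):
    """Illustrative sketch (not the repo's exact implementation).

    attn_weights:   (batch, heads, query_len, num_low_res_tokens)
                    attention paid to the low-resolution image tokens.
    high_res_feats: (batch, num_high_res_tokens, hidden), where each
                    low-res token corresponds to a contiguous block of
                    high-res tokens (ratio = num_high / num_low).
    Returns the high-res features for the top_k most-attended low-res tokens.
    """
    bsz, _, _, num_low = attn_weights.shape
    num_high = high_res_feats.shape[1]
    ratio = num_high // num_low

    # Importance of each low-res image token: attention mass averaged
    # over heads and query positions.
    importance = attn_weights.mean(dim=(1, 2))            # (batch, num_low)
    top_idx = importance.topk(top_k, dim=-1).indices      # (batch, top_k)

    # Map each selected low-res index to its block of high-res token indices.
    offsets = torch.arange(ratio, device=top_idx.device)
    high_idx = (top_idx.unsqueeze(-1) * ratio + offsets).flatten(1)  # (batch, top_k * ratio)

    # Gather the selected high-resolution features.
    gather_idx = high_idx.unsqueeze(-1).expand(-1, -1, high_res_feats.size(-1))
    return high_res_feats.gather(1, gather_idx)            # (batch, top_k * ratio, hidden)
```

In the paper's description, the selected high-resolution features are then attended to together with the low-resolution and text tokens in the subsequent layers, so the selection step above would sit inside the attention forward pass rather than outside the model.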