Closed · lartpang closed this issue 1 month ago

Question

@senfu
This is good work! Where is the code for the FlexAttention and high-resolution feature selection modules proposed in the paper? There doesn't seem to be a code snippet corresponding to these names in the repository.

Answer

Hi @lartpang, thanks for your interest. The FlexAttention and high-resolution feature selection modules are implemented in transformers/src/transformers/models/llama/modeling_llama.py, within the self-attention implementation. For example, this line.
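Since the answer points at the whole self-attention implementation rather than an isolated module, here is a minimal sketch of the high-resolution feature selection idea: attention mass computed over low-resolution image tokens is used to pick the top-k corresponding groups of high-resolution tokens. This is not the repository's code; the function name `select_hires_tokens`, the tensor shapes, and the parameter `k` are all hypothetical illustrations of the mechanism.

```python
# Hedged sketch, NOT the repository's implementation. It illustrates
# attention-guided high-resolution feature selection: the low-res tokens
# that receive the most attention decide which high-res features are kept.
# All names and shapes below are hypothetical.
import torch

def select_hires_tokens(attn_weights: torch.Tensor,
                        hires_feats: torch.Tensor,
                        k: int = 64) -> torch.Tensor:
    """Select the top-k groups of high-resolution tokens per batch element.

    attn_weights: (batch, num_lowres_tokens) attention mass each
        low-resolution image token received (e.g., averaged over heads
        and query positions inside the self-attention module).
    hires_feats: (batch, num_lowres_tokens, tokens_per_patch, dim)
        high-resolution features grouped under their low-res parent token.
    Returns: (batch, k * tokens_per_patch, dim) selected features.
    """
    batch, n_low, per_patch, dim = hires_feats.shape
    # Indices of the k low-res tokens with the highest attention mass.
    topk = attn_weights.topk(k, dim=-1).indices            # (batch, k)
    # Gather the corresponding groups of high-resolution features.
    idx = topk[:, :, None, None].expand(-1, -1, per_patch, dim)
    selected = hires_feats.gather(1, idx)                  # (batch, k, per_patch, dim)
    return selected.reshape(batch, k * per_patch, dim)

# Example usage with dummy tensors:
attn = torch.rand(2, 144)             # attention mass over 144 low-res tokens
hires = torch.randn(2, 144, 4, 1024)  # 4 high-res tokens per low-res token
print(select_hires_tokens(attn, hires, k=16).shape)  # torch.Size([2, 64, 1024])
```

In the actual repo this logic lives inside the LLaMA self-attention forward pass rather than a standalone function, so the selection can reuse the attention weights that layer already computes.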