Open Zsong1106 opened 3 months ago
I have looked through the overall code but haven't found which module it is in yet. Is the FlexAttention module in that .py file?
The FlexAttention and high-resolution feature selection modules are implemented in transformers/src/transformers/models/llama/modeling_llama.py, within the self-attention implementation. For example, this line.