UMass-Foundation-Model / FlexAttention

Apache License 2.0

[Question] Finding FlexAttn #5

Open Zsong1106 opened 3 months ago

Zsong1106 commented 3 months ago

Question

I have looked through the overall code but haven't found which module implements FlexAttention yet. Which .py file is the FlexAttention module in?

senfu commented 2 months ago

The FlexAttention and high-resolution feature selection modules are implemented in transformers/src/transformers/models/llama/modeling_llama.py, within the self-attention implementation. For example, this line.
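To give a rough idea of what attention-based high-resolution feature selection looks like, here is a minimal sketch (not the repo's actual code; the function name and shapes are hypothetical): each high-resolution image token is scored by how much attention it receives, and only the top-k tokens are kept for the next layer.

```python
import numpy as np

def select_high_res_tokens(attn_scores, hi_res_feats, k):
    """Keep the k high-resolution tokens that receive the most attention.

    attn_scores: shape (num_tokens,), aggregated attention each hi-res
        token receives from the query tokens.
    hi_res_feats: shape (num_tokens, dim), high-resolution image features.
    Returns the selected features and their indices (highest score first).
    """
    # Indices of the k largest scores, in descending order of score.
    top_idx = np.argsort(attn_scores)[-k:][::-1]
    return hi_res_feats[top_idx], top_idx

# Toy example: 5 hi-res tokens with 2-dim features.
scores = np.array([0.1, 0.4, 0.05, 0.3, 0.15])
feats = np.arange(10, dtype=float).reshape(5, 2)
selected, idx = select_high_res_tokens(scores, feats, k=2)
```

In the actual model the scores would come from the attention map inside the self-attention forward pass; this sketch only shows the top-k selection step itself.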