mit-han-lab / streaming-llm

[ICLR 2024] Efficient Streaming Language Models with Attention Sinks
https://arxiv.org/abs/2309.17453
MIT License

The reason for the importance of the initial token. #59

Open freyamom opened 8 months ago

freyamom commented 8 months ago

Hi, I am surprised by the importance of the initial token. Attention values show the correlation between previous tokens and the current token, yet the initial token has a huge correlation with tokens that are very far away from it. I was trying to figure out the reason, and I guess it may be related to the masked (causal) self-attention layer. At the beginning of the sequence there are only a few tokens over which to compute attention, every later token i+k can still refer back to the first token, and attention to tokens after the current position is forced to 0 by the mask. So from the very beginning, the initial token becomes very important!
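Here is a minimal sketch of what I mean (plain scaled dot-product attention on random query/key vectors, not the repo's code): with the causal mask, softmax at every position must spread a total weight of 1 over only the visible tokens, and position 0 is the only token that every query can see.

```python
import torch

torch.manual_seed(0)
seq_len, d = 8, 16

q = torch.randn(seq_len, d)
k = torch.randn(seq_len, d)

scores = q @ k.T / d ** 0.5  # (seq_len, seq_len) attention logits

# Causal mask: positions after the current token get -inf, i.e. 0 after softmax.
causal_mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)
scores = scores.masked_fill(causal_mask, float("-inf"))

attn = torch.softmax(scores, dim=-1)

# Average attention each key position receives across all query positions.
# Position 0 is visible to every query, while position i is visible only to
# queries >= i, so the earliest positions collect more weight on average
# even with purely random scores.
print(attn.mean(dim=0))
```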

I am not sure whether this reasoning makes sense or not. Thanks a lot!