thunlp / InfLLM

Code for our paper "InfLLM: Unveiling the Intrinsic Capacity of LLMs for Understanding Extremely Long Sequences with Training-Free Memory"
MIT License

Clarification Needed: Why is topk=2 Set for Greedy Decoding? #48

Open Becomebright opened 3 months ago

Becomebright commented 3 months ago

Reference: https://github.com/thunlp/InfLLM/blob/main/inf_llm/chat.py#L168
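To illustrate why the setting is puzzling (this is an illustrative sketch, not InfLLM's actual code): greedy decoding always picks the argmax token, which is equivalent to restricting sampling to `top_k=1`. With `top_k=2`, the second-most-likely token can also be drawn, so decoding is no longer strictly greedy.

```python
import math
import random

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def greedy(logits):
    # Greedy decoding: always take the highest-logit token (same as top_k=1).
    return max(range(len(logits)), key=lambda i: logits[i])

def top_k_sample(logits, k, rng):
    # Keep the k highest logits, renormalize, then sample among them.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    probs = softmax([logits[i] for i in top])
    return rng.choices(top, weights=probs, k=1)[0]

logits = [2.0, 1.5, 0.1, -1.0]
rng = random.Random(0)

print(greedy(logits))  # always token 0
# With top_k=2, token 1 can also be sampled, so the output is stochastic:
print({top_k_sample(logits, 2, rng) for _ in range(100)})
```

So the question stands: if the intent at chat.py#L168 is greedy decoding, one would expect `top_k=1` (or for the parameter to be ignored entirely).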