datamllab / LongLM

[ICML'24 Spotlight] LLM Maybe LongLM: Self-Extend LLM Context Window Without Tuning
https://arxiv.org/pdf/2401.01325.pdf
MIT License

No errors reported, but the experiment results are not displayed #15

Closed guanzy2012 closed 7 months ago

guanzy2012 commented 8 months ago

Tokens of Prompt: 5144 Passkey target: 89427

This is a friendly reminder - the current text generation call will exceed the model's predefined maximum length (4096). Depending on the model, you may observe exceptions, performance degradation, or nothing at all.

Llama2: [What is the pass key? The pass key is 。。。. 。]
SelfExtend: [What is the pass key? The pass key is 。。。. 。]
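The warning above fires because the prompt alone (5144 tokens) already exceeds the base model's pretrained 4096-token window. A minimal sketch of that length check, assuming a 4096-token window and a hypothetical generation budget (neither number comes from the repo's code):

```python
# Minimal sketch of the length check behind the "friendly reminder" warning.
# MAX_PRETRAINED_LEN mirrors Llama-2's 4096-token window; GEN_TOKENS is an
# illustrative generation budget, not a value from the repo.
MAX_PRETRAINED_LEN = 4096
PROMPT_TOKENS = 5144  # "Tokens of Prompt" reported above
GEN_TOKENS = 64       # hypothetical number of new tokens to generate

total = PROMPT_TOKENS + GEN_TOKENS
if total > MAX_PRETRAINED_LEN:
    print(f"warning: {total} tokens exceeds the model's predefined "
          f"maximum length ({MAX_PRETRAINED_LEN})")
```

With the vanilla Llama2 path this overflow is expected to degrade the passkey answer; the point of SelfExtend is to handle such prompts without fine-tuning, which is why identical garbled output from both paths suggests the patch was not applied.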

guanzy2012 commented 8 months ago

The installed version of transformers is not 4.32.0.
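Since the repo patches transformers internals, a version mismatch can silently produce unpatched (vanilla) behavior. A hedged sketch for checking the installed version against the one mentioned above; the helper name is made up for illustration:

```python
from importlib.metadata import version, PackageNotFoundError

REQUIRED_VERSION = "4.32.0"  # version referenced in the comment above

def check_transformers_version(required: str = REQUIRED_VERSION) -> bool:
    """Return True if the installed transformers version matches `required`.

    Hypothetical helper: returns False when transformers is missing or
    its version differs from the required one.
    """
    try:
        return version("transformers") == required
    except PackageNotFoundError:
        return False

if not check_transformers_version():
    print(f"transformers != {REQUIRED_VERSION}; try "
          f"`pip install transformers=={REQUIRED_VERSION}`")
```

Running this before the experiment scripts makes the mismatch explicit instead of letting the run complete with wrong results.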

Mooler0410 commented 8 months ago

Could you please elaborate on your environment and the scripts you ran? In both our tests and other folks' tests, the current code works well.