thunlp / InfLLM

The code of our paper "InfLLM: Unveiling the Intrinsic Capacity of LLMs for Understanding Extremely Long Sequences with Training-Free Memory"
MIT License

ZERO Score when using Origin settings #43

Open mazeyang opened 2 months ago


Hi, this is great work! But I've run into a strange problem:

When I use mistral-origin.yaml and vicuna-origin.yaml, the evaluation scores are 0.

More specifically, the predictions from mistral-origin.yaml are mostly star symbols, like: " "

and the predictions from vicuna-origin.yaml are repeated "nobody" tokens, like: "nobody nobody nobody nobody nobody nobody nobody nobody nobody nobody nobody "

I have reproduced these results three times.
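A quick way to flag such degenerate outputs before scoring — a minimal sketch, not part of the InfLLM codebase, with a hypothetical threshold — is to check whether a single token dominates each prediction:

```python
from collections import Counter

def is_degenerate(pred: str, threshold: float = 0.8) -> bool:
    """Return True if the prediction is empty or one token dominates it,
    e.g. the repeated "nobody" outputs described above.
    The 0.8 threshold is an arbitrary choice for illustration."""
    tokens = pred.split()
    if not tokens:
        # Whitespace-only predictions (the star-symbol case) are degenerate too.
        return True
    top_count = Counter(tokens).most_common(1)[0][1]
    return top_count / len(tokens) >= threshold

print(is_degenerate("nobody " * 11))          # repeated-token case -> True
print(is_degenerate("   "))                   # whitespace-only case -> True
print(is_degenerate("The answer is Paris."))  # a normal prediction -> False
```

Running this over the prediction files would confirm whether every output is degenerate, or only some prompts fail.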

Has anyone else encountered this problem?

Environment: transformers==4.37.2, torch==2.1.0, on an A100-80G.