mistralai / mistral-inference

Official inference library for Mistral models
https://mistral.ai/
Apache License 2.0

Mistral input context length limitation #102

Open DanYoto opened 7 months ago

DanYoto commented 7 months ago

Hi, I have used the source code here and downloaded the instruct-v0.2 weights from https://docs.mistral.ai/models/. In the source code I set `instruct: bool = True` in main.py. The problem is that when the input context length is longer than 4096 (the sliding window size), the generated text starts to make no sense, both with chunk = 4096 and without chunking. I am wondering, has anyone run into the same problem?
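
To make the setup concrete, here is a minimal sketch of the kind of chunked prompt processing described above, where the prompt is fed to the model in pieces no larger than the sliding-window size. This is an illustration only: `chunked_prefill` is a hypothetical helper, not part of the mistral-inference API.

```python
def chunked_prefill(tokens, window=4096):
    """Yield successive chunks of at most `window` tokens.

    In a sliding-window model, each chunk would be run through the
    forward pass in turn, so the KV cache never has to hold more
    than `window` tokens of the prompt at once.
    """
    for start in range(0, len(tokens), window):
        yield tokens[start:start + window]


# Fake token ids standing in for a prompt longer than the window.
prompt = list(range(10000))
chunks = list(chunked_prefill(prompt, window=4096))
print([len(c) for c in chunks])  # → [4096, 4096, 1808]
```

The question in this issue is why generation degrades once the prompt exceeds a single window, i.e. once this loop runs more than one iteration.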