marella / chatdocs

Chat with your documents offline using AI.
MIT License

Llama 2 and Code Llama support? #73

Open Ananderz opened 12 months ago

Ananderz commented 12 months ago

I have been trying to get Llama 2 models to work correctly. They start off fine, but then all of them go into a loop of repetitions or gibberish.

I haven't tried setting model_type: llama to something else; could it be that we need to use llama2 here instead?

model_type: llama

Is it also possible to get any of the code LLMs to work with this?
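
For reference, here is a rough sketch of how I'd expect a Llama 2 GGML model to be loaded through ctransformers (the repo and file names below are just examples, not anything chatdocs ships with):

```python
# Minimal sketch, not chatdocs' own code: loading a Llama 2 GGML model
# directly with ctransformers. Repo and file names are examples only.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Llama-2-7B-Chat-GGML",                # example model repo
    model_file="llama-2-7b-chat.ggmlv3.q4_0.bin",   # example quantized file
    model_type="llama",                             # Llama 2 still uses the plain "llama" type
)

print(llm("What does the telecom industry do?", max_new_tokens=64))
```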

Ananderz commented 12 months ago

I tried llama-2 and llama2, then read the ctransformers documentation and realized it's just llama.

The answer gets stuck in a loop when using Llama 2 models:

The telecom industry is not not not not not not not not not not not not not not not

Like that. I read somewhere that it could be related to RoPE settings, but I don't know how to set that!
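
For what it's worth, these are the kinds of generation settings that are usually suggested for taming repetition loops when calling ctransformers directly. The values are guesses, and I'm not sure which of these chatdocs exposes in its YAML config:

```python
# Sketch only: sampling settings that often reduce repetition loops with
# ctransformers. The values are illustrative, not tested recommendations.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Llama-2-7B-Chat-GGML",   # example model, as above
    model_type="llama",
)

text = llm(
    "What does the telecom industry do?",
    max_new_tokens=256,
    temperature=0.7,           # a little randomness helps break exact loops
    repetition_penalty=1.15,   # penalize recently generated tokens
    last_n_tokens=64,          # window the penalty looks back over
)
print(text)
```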

Ananderz commented 11 months ago

Fixed it by implementing a prompt template!
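
For anyone else hitting this: by "prompt template" I mean wrapping the question in the [INST] / <<SYS>> chat format that the Llama 2 chat models were fine-tuned on. A rough sketch of that wrapper (not necessarily the exact change I made, and the system prompt is just an example):

```python
# Rough sketch of the standard Llama 2 chat prompt format. The system prompt
# and helper name are illustrative, not part of chatdocs.
SYSTEM_PROMPT = "You are a helpful assistant. Answer using the provided context."

def llama2_prompt(question: str, context: str = "") -> str:
    """Wrap a question (and optional retrieved context) in the Llama 2 chat template."""
    user = f"{context}\n\n{question}".strip()
    return f"[INST] <<SYS>>\n{SYSTEM_PROMPT}\n<</SYS>>\n\n{user} [/INST]"

# Example usage with the ctransformers model from the earlier sketches:
# print(llm(llama2_prompt("What does the telecom industry do?"), max_new_tokens=256))
```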