PromtEngineer / localGPT

Chat with your documents on your local device using GPT models. No data leaves your device and 100% private.
Apache License 2.0

Error in generating answer, prompt template #611

Open Komal-99 opened 10 months ago

Komal-99 commented 10 months ago

I am using the TheBloke/Vicuna-13B-v1.3-German-GPTQ model with load_full_model in localGPT.

When I ask a query, the response (`res`) prints the source documents, but the `result` key comes back empty, i.e. the model is unable to generate an answer from the context. I printed the prompt template; it takes three parameters: `history`, `context`, and `question`. Whenever the prompt is passed to the text-generation pipeline, `context` is empty.

As can be seen in the highlighted text of the attached screenshot, the `context` slot is blank, which is why the model does not return any answer. I am not able to find the cause; can you help me?

@PromtEngineer
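
To illustrate the symptom: localGPT fills a prompt template with three slots (`history`, `context`, `question`) before handing it to the text-generation pipeline. Below is a minimal sketch in plain Python string formatting (the template text and function names are illustrative, not the actual localGPT code, which builds a LangChain `PromptTemplate` with the same input variables). It shows how an empty `context` produces a prompt with no source text, so the model has nothing to answer from.

```python
# Illustrative three-slot prompt template (names and wording are assumptions,
# mirroring the history/context/question parameters described above).
TEMPLATE = (
    "Use the following context to answer the question at the end.\n\n"
    "{history}\n\n"
    "Context: {context}\n\n"
    "Question: {question}\n"
)

def build_prompt(history: str, context: str, question: str) -> str:
    """Fill all three slots of the template.

    If retrieval returns nothing, `context` is an empty string and the
    model sees no source passages -- the situation reported in this issue.
    """
    return TEMPLATE.format(history=history, context=context, question=question)

# With retrieved context, the prompt carries the source passages:
filled = build_prompt("", "Vicuna is a chat model.", "What is Vicuna?")
assert "Vicuna is a chat model." in filled

# With a failed retrieval step, the Context slot is blank:
empty = build_prompt("", "", "What is Vicuna?")
assert "Context: \n" in empty
```

A useful first check is therefore whether the retriever itself returns documents for the query; if the source documents print but `context` is still empty, the issue is likely in how the retrieved text is stitched into the template rather than in the model.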

mkmittalofficial commented 10 months ago

I am facing a similar issue while running the Vicuna model. @PromtEngineer