alejandro-ao / ask-multiple-pdfs

A Langchain app that allows you to chat with multiple PDFs

Issue with response completion #11

Open maotoledo opened 1 year ago

maotoledo commented 1 year ago

I've processed a document containing a book chapter, but it always gives me the same limited-length response. [screenshot]

jasonzhouxf commented 1 year ago

me too

kkatrina28 commented 1 year ago

Same issue here. How do I get the response to finish the sentence?

DjangoMaker commented 1 year ago

Try adjusting chunk_size=1000 and chunk_overlap=200: test with 1500 and 250 respectively and see if that makes any difference.
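For reference, this is roughly where those parameters live. A minimal sketch, assuming the app's get_text_chunks helper uses LangChain's CharacterTextSplitter (as in the repo's app.py); the 1500/250 values are just the test values suggested above:

```python
from langchain.text_splitter import CharacterTextSplitter

def get_text_chunks(raw_text):
    # Larger chunks give the LLM more context per retrieved passage,
    # at the cost of longer prompts (see the token-limit error below).
    text_splitter = CharacterTextSplitter(
        separator="\n",
        chunk_size=1500,    # try instead of 1000
        chunk_overlap=250,  # try instead of 200
        length_function=len,
    )
    return text_splitter.split_text(raw_text)
```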

mohitkapoor1230 commented 1 year ago

Any update on this issue, please? When I increased chunk_size and chunk_overlap as suggested, the Inference API returned an error: "Input validation error: inputs must have less than 1024 tokens. Given: 1208". How can this response completion issue be fixed? Please assist @kkatrina28 @DjangoMaker @jasonzhouxf
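That error comes from the hosted model's input limit: the retrieved chunks plus the question are sent as a single prompt, so larger chunks can push the prompt past 1024 tokens. A rough way to sanity-check prompt size before sending, sketched with the transformers tokenizer; the model name google/flan-t5-xxl and the fits_input_limit helper are assumptions for illustration, not part of the repo:

```python
from transformers import AutoTokenizer

# Assumed model; the model actually used via HuggingFaceHub may differ.
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xxl")

def fits_input_limit(prompt: str, limit: int = 1024) -> bool:
    # The Inference API error above rejects inputs of 1024 tokens or more.
    return len(tokenizer.encode(prompt)) < limit
```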

SoniaGrh commented 10 months ago

Try this:

```python
llm = HuggingFaceHub(
    repo_id="tiiuae/falcon-7b-instruct",
    task="text-generation",
    model_kwargs={"max_new_tokens": 200},
)
```
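In context, that llm would replace the one built inside get_conversation_chain. A sketch of how it plugs in, assuming the ConversationalRetrievalChain setup from the repo's app.py:

```python
from langchain.llms import HuggingFaceHub
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationalRetrievalChain

def get_conversation_chain(vectorstore):
    # Instruct-tuned model with a higher max_new_tokens, so answers
    # are less likely to be cut off mid-sentence.
    llm = HuggingFaceHub(
        repo_id="tiiuae/falcon-7b-instruct",
        task="text-generation",
        model_kwargs={"max_new_tokens": 200},
    )
    memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
    return ConversationalRetrievalChain.from_llm(
        llm=llm,
        retriever=vectorstore.as_retriever(),
        memory=memory,
    )
```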