marella / chatdocs

Chat with your documents offline using AI.
MIT License

Custom LangChain prompt via `config.retriever.custom_prompt` #84

Open matteocargnelutti opened 11 months ago

matteocargnelutti commented 11 months ago

This PR allows replacing LangChain's default prompt via the `retriever.custom_prompt` property of the config object.

AlexPerkin commented 10 months ago

Could you please provide an example `chatdocs.yml` for a custom LangChain prompt? Thank you in advance.

matteocargnelutti commented 10 months ago

For sure. Any prompt compatible with LangChain's `PromptTemplate` should do.

In chatdocs.yml:

```yaml
retriever:
  custom_prompt: "
    Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.

    {context}

    Question: {question}
    Helpful Answer:"
```
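To illustrate what happens with this config at query time: LangChain's `PromptTemplate` fills the `{context}` and `{question}` placeholders before the prompt is sent to the model. A minimal self-contained sketch, using plain `str.format` as a stand-in for `PromptTemplate` (the context and question values below are made up for illustration):

```python
# The same prompt text as in the chatdocs.yml example above.
custom_prompt = (
    "Use the following pieces of context to answer the question at the end. "
    "If you don't know the answer, just say that you don't know, "
    "don't try to make up an answer.\n\n"
    "{context}\n\n"
    "Question: {question}\n"
    "Helpful Answer:"
)

# At query time, the retriever supplies {context} (the matched document
# chunks) and the user supplies {question}:
filled = custom_prompt.format(
    context="ChatDocs lets you chat with your documents offline using AI.",
    question="What does ChatDocs do?",
)
print(filled)
```

After substitution, `filled` contains the full prompt with both placeholders resolved, which is what the LLM actually receives.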
AlexPerkin commented 10 months ago

For my part, I would like to offer a prompt designed for Llama 2, which significantly improves the quality of answers:

```yaml
retriever:
  custom_prompt: "[INST] <<SYS>>Use the following pieces of context to answer the question at the end. Let's think step by step. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>

CONTEXT:

{context}

Question: {question}
[/INST]

Helpful Answer:"
```
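Whatever wrapper format the model expects, a custom prompt still needs both the `{context}` and `{question}` placeholders that the retrieval chain fills in. A small sanity check, using only the standard library (`string.Formatter`), can catch a typo in a placeholder name before the prompt is used; the abbreviated Llama 2 prompt below is a shortened stand-in for the full one above:

```python
from string import Formatter

def placeholder_names(prompt: str) -> set:
    """Return the set of {named} placeholders found in a prompt template."""
    return {name for _, name, _, _ in Formatter().parse(prompt) if name}

# Abbreviated version of the Llama 2 style prompt, for illustration only.
llama2_prompt = (
    "[INST] <<SYS>>Use the following pieces of context to answer the "
    "question at the end.<</SYS>>\n\n"
    "CONTEXT:\n\n{context}\n\nQuestion: {question}\n[/INST]\n\nHelpful Answer:"
)

# Both placeholders the retrieval chain substitutes are present:
assert placeholder_names(llama2_prompt) == {"context", "question"}
```

A check like this is worth running after editing `chatdocs.yml`, since a misspelled placeholder would otherwise surface only as a runtime error or a prompt sent with the literal braces left in.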