Thank you for sharing your work here! It's very helpful. I am working on a RAG pipeline using Kendra, LangChain, and Llama-2. I created a Kendra index and have been using the Llama-2 Python script for Q&A. I see the prompts in the .py file, but they don't seem to follow the correct Llama-2 format. Is that handled somewhere else in the code? I couldn't find where. Can you point me to it?
Hi,
As per my understanding, this is the correct format to prompt Llama-2: https://huggingface.co/blog/llama2#how-to-prompt-llama-2
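For reference, here is a minimal sketch of that format in Python. The template string follows the `[INST]`/`<<SYS>>` structure described in the linked blog post; the helper name and the example system/user strings are my own placeholders, not anything from the repo's script.

```python
def build_llama2_prompt(system_prompt: str, user_message: str) -> str:
    """Wrap a system prompt and a user message in the Llama-2 chat template.

    Single-turn format, per the Hugging Face blog linked above:
    <s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]
    """
    return (
        f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

# Hypothetical RAG usage: the retrieved Kendra passages would go into the
# system prompt (or be prepended to the user message) before calling the model.
prompt = build_llama2_prompt(
    "Answer using only the provided context passages.",
    "What does the document say about refunds?",
)
print(prompt)
```

If the script builds its prompt as a plain question string without these tags, the model will still respond, but quality usually improves when the template matches what the chat variant was fine-tuned on.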
Let me know in case of any questions. Thanks!