momegas / megabots

🤖 State-of-the-art, production ready LLM apps made mega-easy, so you don't have to build them from scratch 🤯 Create a bot, now 🫵

Add option for models to provide sources in QnA #9

Closed momegas closed 1 year ago

momegas commented 1 year ago

Is your feature request related to a problem? Please describe.

During bot instantiation, the user should be able to specify whether sources should be included in responses. This is especially relevant for the qna-over-docs bots.

Describe the solution you'd like

qnabot = bot("qna-over-docs", sources=True)
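For illustration, a rough sketch of how the flag might be wired internally, assuming LangChain and a pre-built vector index. The `QnABot` class and `ask` method below are placeholders, not the actual megabots API:

```python
# Minimal sketch only -- not the actual megabots implementation.
from langchain.chains.qa_with_sources import load_qa_with_sources_chain
from langchain.chains.question_answering import load_qa_chain
from langchain.llms import OpenAI


class QnABot:
    """Hypothetical wrapper: runs a QA chain over documents from an index."""

    def __init__(self, chain, index):
        self.chain = chain
        self.index = index

    def ask(self, question: str) -> str:
        # Retrieve relevant documents from the index (e.g. a FAISS vector store).
        docs = self.index.similarity_search(question)
        result = self.chain({"input_documents": docs, "question": question})
        # The with-sources chain appends a "SOURCES:" section to the answer.
        return result["output_text"]


def bot(task: str, index=None, sources: bool = False) -> QnABot:
    """Hypothetical factory: picks a chain based on the `sources` flag."""
    llm = OpenAI(temperature=0)
    if sources:
        chain = load_qa_with_sources_chain(llm, chain_type="stuff")
    else:
        chain = load_qa_chain(llm, chain_type="stuff")
    return QnABot(chain, index)
```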

You can use load_qa_with_sources_chain from LangChain, but in my tests it failed when I tried to change the prompt. Maybe I was doing something wrong, but I have also opened an issue about it.
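If we go with load_qa_with_sources_chain, here is a hedged sketch of overriding the prompt. Note that the "stuff" variant expects `summaries` and `question` as input variables, which could be the reason a changed prompt gets rejected:

```python
# Sketch of passing a custom prompt to load_qa_with_sources_chain.
# The "stuff" variant fills `{summaries}` (not `{context}`) with the documents.
from langchain.chains.qa_with_sources import load_qa_with_sources_chain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

template = """Answer the question using only the context below.
Cite your sources at the end of the answer as "SOURCES: ...".

{summaries}

Question: {question}
Answer:"""

prompt = PromptTemplate(template=template, input_variables=["summaries", "question"])
chain = load_qa_with_sources_chain(OpenAI(temperature=0), chain_type="stuff", prompt=prompt)
```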

Describe alternatives you've considered

There is a chance that the ready-made chains provided by LangChain won't be enough at some point, but I think that is a concern for later.

Additional context

N/A