Closed jacksbox closed 2 years ago
Hello @jacksbox I'm glad you want to contribute to Haystack! However, it's not entirely clear to me how you imagine this GPT-3-based Reader working. Can you flesh out your idea a bit more? Once we have a clear idea of the effort required, it will be easier to see whether it's worth it :wink:
The idea as a rough outline:
We do something like this at my company, and it works quite nicely for question answering in natural language.
Ok, thanks for the description! I see where this is going. We call this approach "Long Form Question Answering" and we have a node called RAGenerator
(https://haystack.deepset.ai/components/generator) that does something similar to what you propose, with the difference that it works locally with a Retrieval-Augmented generation model (https://arxiv.org/abs/2005.11401).
I suggest you read a bit about how it works and how it's used in Haystack pipelines, for example here: https://haystack.deepset.ai/tutorials/retrieval-augmented-generation or here: https://haystack.deepset.ai/tutorials/lfqa Once you're more familiar with the approach, it should be much easier to contribute your own answer generator based on GPT-3!
I'll be happy to help you along the way, so let me know if you need any help or if something is unclear about how LFQA works in Haystack :slightly_smiling_face: Looking forward to seeing this new node!
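In case it helps to see the shape of the flow: LFQA is essentially a retriever that preselects the top-k passages, followed by a generator that writes a free-text answer conditioned on them. Here is a toy, stdlib-only sketch of that pattern; `toy_retrieve` and `build_lfqa_prompt` are made-up illustrative names, not Haystack APIs:

```python
# Toy sketch of the Long Form Question Answering (LFQA) flow:
# a retriever narrows the corpus to the top-k passages, then a
# generator produces a free-text answer conditioned on them.
# These helpers are illustrative, not part of the Haystack API.

def toy_retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query (stand-in for a real retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_lfqa_prompt(query: str, passages: list[str]) -> str:
    """Concatenate the preselected passages into a prompt for a generative model."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [
    "Haystack pipelines combine retrievers with readers or generators.",
    "The RAGenerator node performs retrieval-augmented generation locally.",
    "Elasticsearch can serve as a document store.",
]
query = "What does the RAGenerator node do?"
prompt = build_lfqa_prompt(query, toy_retrieve(query, docs))
print(prompt)
```

A real setup would swap `toy_retrieve` for a dense or BM25 retriever and feed the prompt to a seq2seq generator; the point is only that the generator never sees the whole corpus, just the retriever's preselection.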
Hello @jacksbox, just checking, did you eventually try to implement this? I'm really curious how it's going if you did :slightly_smiling_face:
We are now working on integrating OpenAI's GPT-3 model via their API into Haystack in the following PR: https://github.com/deepset-ai/haystack/pull/2605
It would be cool if GPT-3/OpenAI or other services could be used as a Reader. Instead of running a local model, a request could be made to the service (containing the retriever's preselection).
If there is demand for this feature, I would volunteer to integrate it :)
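To illustrate the proposal, here is a minimal sketch of what such a service-backed node could do: package the question plus the preselected passages into a request to OpenAI's completions endpoint instead of running a local model. The prompt layout, model choice, and helper name are assumptions for illustration, not the eventual Haystack implementation:

```python
# Sketch of a GPT-3-backed "reader": send the retriever's preselected
# passages to OpenAI's completions API rather than running a local model.
# `build_request` and the prompt format are illustrative assumptions.
import json
import urllib.request

OPENAI_COMPLETIONS_URL = "https://api.openai.com/v1/completions"

def build_request(question: str, passages: list[str], api_key: str) -> urllib.request.Request:
    """Package the question and preselected passages into an API request."""
    context = "\n".join(passages)
    payload = {
        "model": "text-davinci-002",  # GPT-3-era model name; adjust as needed
        "prompt": f"{context}\n\nQuestion: {question}\nAnswer:",
        "max_tokens": 100,
        "temperature": 0.0,
    }
    return urllib.request.Request(
        OPENAI_COMPLETIONS_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# Actually sending the request needs a real API key:
# with urllib.request.urlopen(build_request(question, passages, key)) as resp:
#     answer = json.load(resp)["choices"][0]["text"]
```

The appeal is that the node stays thin: no model weights to download or GPU to provision, just prompt construction and an HTTP call, at the cost of latency, per-request pricing, and sending document content to a third party.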