Future-House / paper-qa

High accuracy RAG for answering questions from scientific documents with citations
Apache License 2.0

Implement support to BatchAPIs to gather evidence #687

Open · maykcaldas opened 1 week ago

maykcaldas commented 1 week ago

This PR adds support for sending requests to the OpenAI and Anthropic batch APIs. Because gathering evidence summarizes all candidate papers in parallel, we plan to use the batch API whenever possible.
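For context, the provider-side flow for OpenAI looks roughly like the sketch below. This is a minimal standalone example, not paper-qa's internal code; the model name, file name, and prompt text are placeholders.

```python
import json
import time

from openai import OpenAI

client = OpenAI()

# One JSONL line per summarization request, e.g. one per candidate paper.
requests = [
    {
        "custom_id": f"paper-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",  # placeholder model
            "messages": [
                {"role": "user", "content": f"Summarize the evidence in paper {i}."}
            ],
        },
    }
    for i in range(3)
]

with open("batch_requests.jsonl", "w") as f:
    for req in requests:
        f.write(json.dumps(req) + "\n")

# Upload the request file and create the batch job.
input_file = client.files.create(
    file=open("batch_requests.jsonl", "rb"), purpose="batch"
)
batch = client.batches.create(
    input_file_id=input_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",  # results are returned within 24 hours
)

# Poll until the batch finishes; this is the blocking wait described below.
terminal = {"completed", "failed", "expired", "cancelled"}
while (job := client.batches.retrieve(batch.id)).status not in terminal:
    time.sleep(60)

if job.status == "completed":
    # JSONL output, one result per custom_id.
    results = client.files.content(job.output_file_id).text
    print(results)
```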

Batch API usage is controlled by Settings.use_batch_in_summary, so paper-qa workflows remain unchanged when this setting is False (the default). Currently, using a batch keeps the process busy until the batch finishes on the LLM provider's side, which can take up to 24 hours. This scaling issue will be addressed in a follow-up PR.
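Assuming the flag is exposed directly on Settings (its exact location in the settings hierarchy may differ in the final version of this PR), opting in could look something like this:

```python
from paperqa import Settings, ask

# Opt in to batch summarization; the default (False) keeps the existing
# request-per-paper behavior.
settings = Settings(use_batch_in_summary=True)  # assumed top-level placement

answer = ask(
    "What is the effect of temperature on enzyme catalysis?",
    settings=settings,
)
```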

Task list