MLSysOps / MLE-agent

🤖 MLE-Agent: Your intelligent companion for seamless AI engineering and research. 🔍 Integrates with arXiv and Papers with Code to provide better code/research plans. 🧰 OpenAI, Anthropic, Ollama, etc. supported. 🎆 Code RAG
https://repx.app/
MIT License
989 stars · 41 forks

Continuous batching query #139

Closed: leeeizhang closed this issue 6 days ago

leeeizhang commented 3 weeks ago

https://platform.openai.com/docs/guides/batch/overview

dosubot[bot] commented 3 weeks ago

The MLE-agent repository does not currently have any integrations or modules that interact with OpenAI's Batch API. However, the repository does include general configuration settings for OpenAI's API, which could be adapted for batch processing: in the mle/cli.py file, the OpenAI API key is stored in the project.yml configuration file [1].
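Per the Batch API documentation linked above, a batch job consumes a JSONL file in which each line is an independent request tagged with a unique custom_id (results come back out of order, so the tag is how you match them up). A minimal sketch of building that input file, assuming nothing about MLE-agent's internals (the function name and defaults here are illustrative, not part of the repository):

```python
import json

def build_batch_input(prompts, model="gpt-4o-mini", path="batch_input.jsonl"):
    """Write a JSONL input file in the shape OpenAI's Batch API expects.

    Each line is a self-contained request; custom_id lets the caller
    match the (unordered) batch results back to the original prompts.
    """
    with open(path, "w") as f:
        for i, prompt in enumerate(prompts):
            request = {
                "custom_id": f"request-{i}",
                "method": "POST",
                "url": "/v1/chat/completions",
                "body": {
                    "model": model,
                    "messages": [{"role": "user", "content": prompt}],
                },
            }
            f.write(json.dumps(request) + "\n")
    return path
```

The file would then be uploaded with purpose="batch" and submitted via the batches endpoint with a completion window, as described in the linked guide; those network calls are omitted here.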

Additionally, the repository includes utility functions for making HTTP requests to external APIs, such as the _make_request method in the GitHubIntegration class [2]; this could serve as a starting point for implementing batch processing against OpenAI's Batch API.
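Batch jobs are asynchronous, so whatever request helper is used ultimately needs a polling loop around the batch status endpoint. A sketch of that loop, written against a pluggable callable so it is independent of the transport; wait_for_batch and fetch_status are hypothetical names, and in practice fetch_status could be a thin wrapper over something like _make_request issuing a GET to /v1/batches/{batch_id}:

```python
import time

def wait_for_batch(batch_id, fetch_status, poll_interval=2.0, timeout=60.0):
    """Poll a batch job until it reaches a terminal status.

    fetch_status: callable taking a batch id and returning the current
    status string (e.g. "validating", "in_progress", "completed").
    """
    terminal = {"completed", "failed", "expired", "cancelled"}
    status = None
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status(batch_id)
        if status in terminal:
            return status
        time.sleep(poll_interval)  # avoid hammering the endpoint
    raise TimeoutError(f"batch {batch_id} still {status!r} after {timeout}s")
```

Keeping the transport behind a callable also makes the loop trivially testable with a stub, without any network access.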

For more detailed information on handling large datasets or asynchronous processing, you may need to refer to external documentation or resources, as the repository does not provide specific guidance on these topics.

To continue talking to Dosu, mention @dosu.