alexrudall / ruby-openai


Add new /batches endpoint for batch of requests run asynchronously #454

Closed · simonx1 closed this 5 months ago

simonx1 commented 5 months ago

Summary

This pull request adds support for the new Batches endpoint in the OpenAI Ruby client library. The Batches endpoint allows creating and managing large batches of API requests to run asynchronously. Currently, only the /v1/chat/completions endpoint is supported for batches.
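
For context, a rough sketch of what creating a batch could look like with this change, assuming the new methods follow the gem's existing `client.<resource>.<action>(parameters:)` pattern; the file name, IDs, and parameter values below are illustrative:

```ruby
require "openai"

client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_ACCESS_TOKEN"))

# Upload a JSONL file of /v1/chat/completions requests with the "batch" purpose,
# then submit it as an asynchronous batch job. "requests.jsonl" is a placeholder path.
file = client.files.upload(
  parameters: { file: "requests.jsonl", purpose: "batch" }
)

batch = client.batches.create(
  parameters: {
    input_file_id: file["id"],
    endpoint: "/v1/chat/completions", # currently the only endpoint supported for batches
    completion_window: "24h"
  }
)

puts batch["id"]     # e.g. "batch_abc123"
puts batch["status"] # e.g. "validating"
```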

Files Changed

README.md

lib/openai.rb

lib/openai/batches.rb (new file)

lib/openai/client.rb

spec/fixtures/cassettes/batch_cancel.yml (new file)

spec/fixtures/cassettes/batch_cancel_setup.yml (new file)

spec/fixtures/cassettes/batches_create.yml (new file)

spec/fixtures/cassettes/batches_list.yml (new file)

spec/fixtures/cassettes/batches_list_setup.yml (new file)

spec/fixtures/cassettes/batches_retrieve.yml (new file)

spec/fixtures/cassettes/batches_retrieve_setup.yml (new file)

spec/openai/client/batches_spec.rb (new file)

Reason for Changes

The Batches endpoint is a new OpenAI API feature for running large batches of API requests asynchronously. Supporting it in the Ruby client library lets users run such batch operations directly from the gem.

Impact of Changes

These changes introduce a new way of interacting with the OpenAI API through the Batches endpoint. Users can now create, manage, and monitor large batches of API requests using the provided methods in the OpenAI::Batches class.

The addition of the Batches endpoint does not affect existing functionality, and all other endpoints and methods remain unchanged.
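
As an illustration of the "manage and monitor" side, a minimal sketch assuming the batch methods mirror the retrieve/list/cancel naming used by the gem's other resources (the batch ID is a placeholder):

```ruby
# Check on a previously created batch.
batch = client.batches.retrieve(id: "batch_abc123")
puts batch["status"]         # e.g. "in_progress", "completed", "failed"
puts batch["output_file_id"] # references the results file once the batch completes

# Enumerate recent batches, or stop one that is still running.
client.batches.list
client.batches.cancel(id: "batch_abc123")
```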

Test Plan

The changes are covered by RSpec specs backed by VCR cassettes that exercise batch creation, retrieval, listing, and cancellation (see spec/openai/client/batches_spec.rb and the new fixtures under spec/fixtures/cassettes/).
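
For reference, a spec following this pattern might look roughly like the sketch below (illustrative shape only, not the actual contents of `spec/openai/client/batches_spec.rb`):

```ruby
RSpec.describe OpenAI::Client do
  describe "#batches" do
    let(:client) { OpenAI::Client.new }

    it "creates a batch" do
      # Replays the recorded HTTP interaction from the batches_create.yml cassette.
      VCR.use_cassette("batches_create") do
        response = client.batches.create(
          parameters: {
            input_file_id: "file-abc123",
            endpoint: "/v1/chat/completions",
            completion_window: "24h"
          }
        )
        expect(response["object"]).to eq("batch")
      end
    end
  end
end
```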

Additional Notes

Please review the changes and provide any feedback or suggestions for improvement. Thank you!


alexrudall commented 5 months ago

This is an awesome PR