This pull request adds support for the new Batches endpoint in the OpenAI Ruby client library. The Batches endpoint allows creating and managing large batches of API requests to run asynchronously. Currently, only the /v1/chat/completions endpoint is supported for batches.
Files Changed
README.md
Added a new section "Batches" to document how to use the new Batches endpoint.
Provided examples of creating a batch, retrieving batch information, canceling a batch, and listing all batches.
Explained the format of the input JSONL file and the output/error files.
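The input-file format documented in the README can be sketched as follows. Each line of the JSONL file is one request to /v1/chat/completions, and the custom_id ties each output line back to its input request. The model name and IDs below are illustrative placeholders, and the batch lifecycle calls shown in comments assume the methods added in this PR:

```ruby
require "json"

# Build the JSONL input: one chat-completion request per line.
requests = [
  {
    custom_id: "request-1",                # links output lines to input requests
    method: "POST",
    url: "/v1/chat/completions",
    body: {
      model: "gpt-3.5-turbo",              # illustrative model name
      messages: [{ role: "user", content: "Hello!" }]
    }
  }
]

jsonl = requests.map { |r| JSON.generate(r) }.join("\n")
File.write("batch_input.jsonl", jsonl)

# After uploading the file, the batch lifecycle looks roughly like:
#   batch = client.batches.create(parameters: {
#     input_file_id: file_id,
#     endpoint: "/v1/chat/completions",
#     completion_window: "24h"
#   })
#   client.batches.retrieve(id: batch["id"])
#   client.batches.cancel(id: batch["id"])
#   client.batches.list
```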
lib/openai.rb
Required the new openai/batches file so that the OpenAI::Batches class is loaded with the rest of the library.
lib/openai/batches.rb (new file)
Implemented the OpenAI::Batches class to interact with the Batches endpoint.
Added methods for listing batches, retrieving a specific batch, creating a new batch, and canceling a batch.
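A minimal sketch of what such a resource class might look like. The get/json_post/post helper names and URI paths here are assumptions modeled on the library's other resource classes, not necessarily the exact implementation in this PR:

```ruby
# Sketch of a Batches resource class. The injected `client` is assumed to
# expose `get`, `json_post`, and `post` helpers that perform the HTTP calls.
module OpenAI
  class Batches
    def initialize(client:)
      @client = client
    end

    def list
      @client.get(path: "/batches")
    end

    def retrieve(id:)
      @client.get(path: "/batches/#{id}")
    end

    def create(parameters: {})
      @client.json_post(path: "/batches", parameters: parameters)
    end

    def cancel(id:)
      @client.post(path: "/batches/#{id}/cancel")
    end
  end
end
```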
lib/openai/client.rb
Added a new batches method to initialize and return an instance of OpenAI::Batches.
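The client accessor presumably follows the same memoized pattern as the library's other resource helpers; a self-contained sketch (the stripped-down Batches and Client classes here are stand-ins for the real ones):

```ruby
module OpenAI
  class Batches
    def initialize(client:)
      @client = client
    end
  end

  class Client
    # Memoize so repeated calls return the same Batches instance.
    def batches
      @batches ||= Batches.new(client: self)
    end
  end
end
```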
spec/fixtures/cassettes/batch_cancel.yml (new file)
Added a new fixture file for testing batch cancellation.
spec/fixtures/cassettes/batch_cancel_setup.yml (new file)
Added a new fixture file for setting up the batch cancellation test.
spec/fixtures/cassettes/batches_create.yml (new file)
Added a new fixture file for testing batch creation.
spec/fixtures/cassettes/batches_list.yml (new file)
Added a new fixture file for testing listing batches.
spec/fixtures/cassettes/batches_list_setup.yml (new file)
Added a new fixture file for setting up the batch listing test.
spec/fixtures/cassettes/batches_retrieve.yml (new file)
Added a new fixture file for testing retrieving a specific batch.
spec/fixtures/cassettes/batches_retrieve_setup.yml (new file)
Added a new fixture file for setting up the batch retrieval test.
spec/openai/client/batches_spec.rb (new file)
Added test cases for the new Batches endpoint methods.
Reason for Changes
The Batches endpoint is a new feature provided by the OpenAI API that allows running large batches of API requests asynchronously. Adding support for this endpoint in the Ruby client library enables users to leverage this functionality and run batch operations more efficiently.
Impact of Changes
These changes introduce a new way of interacting with the OpenAI API through the Batches endpoint. Users can now create, manage, and monitor large batches of API requests using the provided methods in the OpenAI::Batches class.
The addition of the Batches endpoint does not affect existing functionality, and all other endpoints and methods remain unchanged.
Test Plan
The changes have been thoroughly tested using the following approach:
Added new test cases in spec/openai/client/batches_spec.rb to cover the new Batches endpoint methods.
Utilized VCR cassettes to record and replay API interactions during testing.
Ensured that all existing tests continue to pass after introducing the new changes.
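For reference, a VCR cassette such as batches_retrieve.yml records request/response pairs in roughly the following shape. The field values below are illustrative placeholders, not the contents of the actual recorded fixtures:

```yaml
---
http_interactions:
- request:
    method: get
    uri: https://api.openai.com/v1/batches/batch_abc123
    headers:
      Authorization:
      - Bearer <OPENAI_ACCESS_TOKEN>
  response:
    status:
      code: 200
      message: OK
    body:
      string: '{"id":"batch_abc123","object":"batch","status":"completed"}'
recorded_with: VCR 6.1.0
```

Filtering the real access token out of recorded cassettes (as sketched with the placeholder above) keeps secrets out of the repository while still allowing deterministic replay.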
Additional Notes
The Batches endpoint is currently in beta and only supports the /v1/chat/completions endpoint. Support for other endpoints may be added in the future.
Detailed documentation and examples have been provided in the README to guide users on how to use the new Batches endpoint effectively.
Please review the changes and provide any feedback or suggestions for improvement. Thank you!