Testing on OpenAI models that do not support OpenAI structured outputs, e.g. gpt-3.5-turbo, yields the logs below.
Let's be explicit about the supported models for each requestor in the README after #74 lands.
(bespokelabs-curator-py3.12) (base) ➜ bella git:(CURATOR-28-add-a-lite-llm-backend-for-curator) ✗ python examples/poem.py
Processing parallel requests to OpenAI: 0%| | 0/1 [00:00<?, ?it/s]2024-11-21 02:02:34,153 - bespokelabs.curator.request_processor.openai_online_request_processor - WARNING - Request 0 failed with error {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}
2024-11-21 02:02:34,269 - bespokelabs.curator.request_processor.openai_online_request_processor - WARNING - Request 0 failed with error {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}
2024-11-21 02:02:34,387 - bespokelabs.curator.request_processor.openai_online_request_processor - WARNING - Request 0 failed with error {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}
2024-11-21 02:02:34,520 - bespokelabs.curator.request_processor.openai_online_request_processor - WARNING - Request 0 failed with error {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}
2024-11-21 02:02:34,622 - bespokelabs.curator.request_processor.openai_online_request_processor - WARNING - Request 0 failed with error {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}
2024-11-21 02:02:34,623 - bespokelabs.curator.request_processor.openai_online_request_processor - ERROR - Request {'model': 'gpt-3.5-turbo', 'messages': [{'role': 'user', 'content': 'Generate 10 diverse topics that are suitable for writing poems about.'}], 'response_format': {'type': 'json_schema', 'json_schema': {'name': 'output_schema', 'schema': {'properties': {'topics_list': {'description': 'A list of topics.', 'items': {'type': 'string'}, 'title': 'Topics List', 'type': 'array'}}, 'required': ['topics_list'], 'title': 'Topics', 'type': 'object'}}}} failed after all attempts.Saved errors [{'error': {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}}, {'error': {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}}, {'error': {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}}, {'error': {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}}, {'error': {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}}] to /home/charlieji/.cache/curator/5a4940f3cbb921db/responses_0.jsonl
Processing parallel requests to OpenAI: 100%|████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 1.62it/s]
2024-11-21 02:02:34,624 - bespokelabs.curator.request_processor.openai_online_request_processor - WARNING - 1 / 1 requests failed. Errors logged to /home/charlieji/.cache/curator/5a4940f3cbb921db/responses_0.jsonl.
Traceback (most recent call last):
  File "/home/charlieji/workspace/bella/examples/poem.py", line 27, in <module>
    topics: Dataset = topic_generator()
                      ^^^^^^^^^^^^^^^^^
  File "/home/charlieji/workspace/bella/src/bespokelabs/curator/prompter/prompter.py", line 146, in __call__
    return self._completions(self._request_processor, dataset, working_dir)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/charlieji/workspace/bella/src/bespokelabs/curator/prompter/prompter.py", line 240, in _completions
    dataset = request_processor.run(
              ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/charlieji/workspace/bella/src/bespokelabs/curator/request_processor/openai_online_request_processor.py", line 189, in run
    dataset = self.create_dataset_files(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/charlieji/workspace/bella/src/bespokelabs/curator/request_processor/base_request_processor.py", line 325, in create_dataset_files
    raise ValueError("All requests failed")
ValueError: All requests failed
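
For context, a minimal reproduction sketch along the lines of examples/poem.py, reconstructed from the request payload logged above. The exact Prompter arguments (prompt_func, model_name, response_format) are my assumption, not a verbatim copy of the example:

```python
from typing import List

from pydantic import BaseModel, Field

from bespokelabs import curator


# Mirrors the json_schema in the failing request above.
class Topics(BaseModel):
    topics_list: List[str] = Field(description="A list of topics.")


# Assumed Prompter usage; with gpt-3.5-turbo the 'json_schema'
# response_format is rejected by the OpenAI API, so every retry fails.
topic_generator = curator.Prompter(
    prompt_func=lambda: "Generate 10 diverse topics that are suitable for writing poems about.",
    model_name="gpt-3.5-turbo",
    response_format=Topics,
)

topics = topic_generator()  # raises ValueError("All requests failed")
```

Switching to a model listed in the Structured Outputs guide (e.g. gpt-4o-mini) should let the same request go through, which is why a per-requestor note on supported models in the README would help.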