Closed · YuboHe closed this issue 1 year ago
It does not support streaming, but everything else should work. I can make it compatible in 0.3.10; give me a day.
It might be that `openai.error.InvalidRequestError: 'content' is a required property - 'messages.3'` means the max-retries logic is adding a message without a `content` field.
While trying the examples provided in the README with Azure OpenAI, I've encountered an issue with the following code:
```python
from pydantic import BaseModel, ValidationError, BeforeValidator
from typing_extensions import Annotated

from instructor import llm_validator


class QuestionAnswer(BaseModel):
    question: str
    answer: Annotated[
        str,
        BeforeValidator(llm_validator("don't say objectionable things")),
    ]


try:
    qa = QuestionAnswer(
        question="What is the meaning of life?",
        answer="The meaning of life is to be evil and steal",
    )
except ValidationError as e:
    print(e)
```
The error message is the following:
```
InvalidRequestError                       Traceback (most recent call last)
Cell 11, line 13
      7 answer: Annotated[
      8     str,
      9     BeforeValidator(llm_validator("don't say objectionable things"))
     10 ]
     12 try:
---> 13 qa = QuestionAnswer(
     14     question="What is the meaning of life?",
     15     answer="The meaning of life is to be evil and steal",
     16 )
     17 except ValidationError as e:
     18     print(e)

[... skipping hidden 1 frame]

File ~/miniconda3/envs/openai/lib/python3.10/site-packages/instructor/dsl/validators.py:67, in llm_validator.<locals>.llm(v)
     66 def llm(v):
---> 67     resp = openai.ChatCompletion.create(
     68         functions=[Validator.openai_schema],
     69         function_call={"name": Validator.openai_schema["name"]},
     70         messages=[
     71             {
     72                 "role": "system",
...
File ~/miniconda3/envs/openai/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py
     89     )
     90 else:
     91     if model is None and engine is None:

InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.chat_completion.ChatCompletion'>
```
I then checked the `llm_validator` function in `validators.py`, and it doesn't seem to have any code to support Azure OpenAI:
```python
def llm(v):
    resp = openai.ChatCompletion.create(
        functions=[Validator.openai_schema],
        function_call={"name": Validator.openai_schema["name"]},
        messages=[
            {
                "role": "system",
                "content": "You are a world class validation model. Capable to determine if the following value is valid for the statement, if it is not, explain why and suggest a new value.",
            },
            {
                "role": "user",
                "content": f"Does `{v}` follow the rules: {statement}",
            },
        ],
        model=model,
        temperature=temperature,
    )  # type: ignore
    resp = Validator.from_response(resp)
```
This code explains why I get this error. How can I use Instructor with Azure OpenAI?
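For reference, with the legacy openai 0.x SDK used here, Azure is configured at module level, and requests are routed by deployment name via `engine` (or `deployment_id`) rather than `model`. A minimal configuration sketch, with placeholder endpoint, key, and deployment name:

```python
import openai

# Module-level Azure configuration for the legacy openai 0.x SDK.
# The endpoint, key, and deployment name below are placeholders.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE.openai.azure.com/"
openai.api_version = "2023-09-01-preview"
openai.api_key = "YOUR-AZURE-KEY"

# Azure routes by deployment, so calls pass engine=<deployment name>
# instead of model=<model name>:
# openai.ChatCompletion.create(engine="gpt-35-turbo", messages=[...])
```

A call that only passes `model=` therefore trips the "Must provide an 'engine' or 'deployment_id' parameter" error shown above.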
Oh interesting, I see: there's a hard-coded `model` where Azure expects `engine`. I'll work on fixing that.
For now, I'd just use this code: copy-paste it and run it locally.
````python
from typing import Optional

from pydantic import Field

import instructor
import openai


class Validator(instructor.OpenAISchema):
    """
    Validate if an attribute is correct and, if not,
    return a new value with an error message.
    """

    is_valid: bool = Field(
        default=True,
        description="Whether the attribute is valid based on the requirements",
    )
    reason: Optional[str] = Field(
        default=None,
        description="The error message if the attribute is not valid, otherwise None",
    )
    fixed_value: Optional[str] = Field(
        default=None,
        description="If the attribute is not valid, suggest a new value for the attribute",
    )


def llm_validator(
    statement: str,
    allow_override: bool = False,
    engine: str = "gpt-35-turbo",
    temperature: float = 0,
):
    """
    Create a validator that uses the LLM to validate an attribute.

    ## Usage

    ```python
    from pydantic import BaseModel, Field, ValidationError, BeforeValidator
    from typing_extensions import Annotated

    from instructor import llm_validator


    class User(BaseModel):
        name: Annotated[str, BeforeValidator(llm_validator("The name must be a full name all lowercase"))]
        age: int = Field(description="The age of the person")


    try:
        user = User(name="Jason Liu", age=20)
    except ValidationError as e:
        print(e)
    ```

    ```
    1 validation error for User
    name
      The name is valid but not all lowercase (type=value_error.llm_validator)
    ```

    Note that the error message is written by the LLM, and the error type is
    `value_error.llm_validator`.

    Parameters:
        statement (str): The statement to validate
        engine (str): The Azure deployment to use for validation (default: "gpt-35-turbo")
        temperature (float): The temperature to use for the LLM (default: 0)
    """

    def llm(v):
        resp = openai.ChatCompletion.create(
            functions=[Validator.openai_schema],
            function_call={"name": Validator.openai_schema["name"]},
            messages=[
                {
                    "role": "system",
                    "content": "You are a world class validation model. Capable to determine if the following value is valid for the statement, if it is not, explain why and suggest a new value.",
                },
                {
                    "role": "user",
                    "content": f"Does `{v}` follow the rules: {statement}",
                },
            ],
            engine=engine,
            temperature=temperature,
        )  # type: ignore
        resp = Validator.from_response(resp)
        # If the value is not valid but override is allowed, return the fixed value.
        if allow_override and not resp.is_valid and resp.fixed_value is not None:
            return resp.fixed_value
        # Otherwise surface the reason; in the future this could drive a
        # reasking mechanism to generate a better response.
        assert resp.is_valid, resp.reason
        return v

    return llm
````
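To see the `allow_override` behavior in isolation, here is a minimal sketch of the intended decision flow, with the LLM call stubbed out by a hypothetical `FakeValidatorResult` (not part of instructor): an invalid value with an allowed override and a suggested fix returns the fix; otherwise an invalid value raises with the LLM's reason. Note the override check has to run before the validity assertion, or it can never fire.

```python
from dataclasses import dataclass
from typing import Optional


# Hypothetical stand-in for the structured verdict that the real code
# builds via Validator.from_response(resp).
@dataclass
class FakeValidatorResult:
    is_valid: bool
    reason: Optional[str] = None
    fixed_value: Optional[str] = None


def apply_validation(v: str, resp: FakeValidatorResult, allow_override: bool) -> str:
    # Invalid value, override allowed, and a suggested fix: return the fix.
    if allow_override and not resp.is_valid and resp.fixed_value is not None:
        return resp.fixed_value
    # Otherwise an invalid value surfaces the LLM's reason as the error.
    assert resp.is_valid, resp.reason
    return v
```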
It does now!
Just as an FYI to others, I ran into this error while using instructor v0.3.5 with Azure:

`openai.NotFoundError: Error code: 404 - {'error': {'message': 'Unrecognized request argument supplied: functions', 'type': 'invalid_request_error', 'param': None, 'code': None}}`

The fix was to switch away from the `2023-05-15` version of the Azure API. Version `2023-09-01-preview` did the trick, though I haven't checked whether there are others that also work.
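With the 1.x openai SDK (which raises `openai.NotFoundError`), the API version lives on the client. A minimal sketch of pinning a version that accepts the `functions` argument; the endpoint and key are placeholders:

```python
from openai import AzureOpenAI

# Placeholder endpoint and key. The 404 above suggests api_version
# 2023-05-15 rejects requests that send `functions`, so pin a newer one.
client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com/",
    api_key="YOUR-AZURE-KEY",
    api_version="2023-09-01-preview",
)
```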
When I use Azure OpenAI, I often encounter errors, though occasionally it succeeds. I am not sure whether the current instructor can use the Azure OpenAI API. Below are the function and the frequent error message.
Describe the bug
`openai.error.InvalidRequestError: 'content' is a required property - 'messages.3'`
Screenshots
```
Traceback (most recent call last):
  File "C:\Users\yubo.he\Desktop\LLM_AE_Extrator\run.py", line 92, in <module>
    ade_report: Report = generate_report(text_chunks)
  File "C:\Users\yubo.he\Desktop\LLM_AE_Extrator\run.py", line 47, in generate_report
    new_updates = openai.ChatCompletion.create(
  File "C:\Users\yubo.he\AppData\Local\Continuum\anaconda3\envs\syngenta\lib\site-packages\instructor\patch.py", line 162, in new_chatcompletion_sync
    response, error = retry_sync(
  File "C:\Users\yubo.he\AppData\Local\Continuum\anaconda3\envs\syngenta\lib\site-packages\instructor\patch.py", line 117, in retry_sync
    response = func(*args, **kwargs)
  File "C:\Users\yubo.he\AppData\Local\Continuum\anaconda3\envs\syngenta\lib\site-packages\openai\api_resources\chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "C:\Users\yubo.he\AppData\Local\Continuum\anaconda3\envs\syngenta\lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
  File "C:\Users\yubo.he\AppData\Local\Continuum\anaconda3\envs\syngenta\lib\site-packages\openai\api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "C:\Users\yubo.he\AppData\Local\Continuum\anaconda3\envs\syngenta\lib\site-packages\openai\api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "C:\Users\yubo.he\AppData\Local\Continuum\anaconda3\envs\syngenta\lib\site-packages\openai\api_requestor.py", line 775, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: 'content' is a required property - 'messages.3'
```
Additional context
Azure OpenAI API version: 2023-08-01-preview
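One hedged workaround, assuming the error comes from assistant `function_call` messages (whose `content` is `None`) being replayed during instructor's retry loop: normalize the history so every message carries a string `content` before resending. This is a sketch, not instructor's actual fix, and `normalize_messages` is a name invented here:

```python
def normalize_messages(messages):
    """Ensure every chat message has a non-None string 'content' field,
    since some Azure API versions reject messages where it is missing."""
    return [{**m, "content": m.get("content") or ""} for m in messages]
```

Running the message list through this before each retried `ChatCompletion.create` call should keep Azure from rejecting the replayed assistant message.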