xitex opened this issue 3 months ago
I tried this with Groq, not OpenAI (maybe the tag here is wrong). Thanks!
I am getting the same error. It seems the method is simply not implemented, even though it is covered in the docs.
P.S. I am using the OpenAI implementation from langchain_openai.
I have the same error with version 0.1.15.
Thanks! :)
Exploring https://github.com/langchain-ai/langchain-google/blob/main/libs/vertexai/langchain_google_vertexai/chat_models.py#L733 I found that when I pass this model:
```python
## Data model
class code(BaseModel):
    """Code output"""

    prefix: str = Field(description="Description of the problem and approach")
    imports: str = Field(description="Code block import statements")
    code: str = Field(description="Code block not including import statements")
```

This way ...

```python
model = ChatVertexAI(
    model_name="gemini-1.5-pro-preview-0409",
    convert_system_message_to_human=True,
)
llm_with_tool = model.bind(functions=[code])
parser = PydanticOutputFunctionsParser(pydantic_schema=code)
```
returns:

```bash
#!/bin/bash

PROJECT_ID="GCP-PROJECT"
MODEL_ID="gemini-pro"

# Execute the curl command and capture the response
response=$(curl -s \
  -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json" \
  "https://us-central1-aiplatform.googleapis.com/v1/projects/${PROJECT_ID}/locations/us-central1/publishers/google/models/${MODEL_ID}:generateContent" -d \
  $'{
    "contents": {
      "role": "user",
      "parts": {
        "text": "I am passing text key \'foo\' to my prompt and want to process it with a function, process_text(...), prior to the prompt. How can I do this using LCEL?"
      }
    },
    "tools": {
      "function_declarations": {
        "name": "code",
        "description": "Code output",
        "parameters": {
          "type": "OBJECT",
          "description": "Code output",
          "properties": {
            "prefix": {
              "type": "STRING",
              "description": "Description of the problem and approach"
            }
          },
          "properties": {
            "imports": {
              "type": "STRING",
              "description": "Code block import statements"
            }
          },
          "properties": {
            "code": {
              "type": "STRING",
              "description": "Code block not including import statements"
            }
          },
          "required": "prefix",
          "required": "imports",
          "required": "code"
        }
      }
    },
    "generation_config": {}
  }')

# Print the response
echo "$response"
```
to which Gemini returns a 500 error. If I change the type from OBJECT to ARRAY, it works. Not sure if this is related to how this line works: https://github.com/langchain-ai/langchain-google/blob/main/libs/vertexai/langchain_google_vertexai/functions_utils.py
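For reference, here is a sketch of the same function declaration with the duplicate keys merged, based on the OpenAPI-style schema Gemini expects (field names taken from the `code` model above). Note that a JSON object cannot carry three separate `"properties"` or `"required"` keys: most parsers keep only the last one, silently dropping the other fields, which may explain the 500.

```python
import json

# Merged function declaration: one "properties" object holding all three
# fields, and "required" as a single array of field names.
function_declaration = {
    "name": "code",
    "description": "Code output",
    "parameters": {
        "type": "OBJECT",
        "properties": {
            "prefix": {
                "type": "STRING",
                "description": "Description of the problem and approach",
            },
            "imports": {
                "type": "STRING",
                "description": "Code block import statements",
            },
            "code": {
                "type": "STRING",
                "description": "Code block not including import statements",
            },
        },
        # "required" is an array of strings, not repeated scalar keys.
        "required": ["prefix", "imports", "code"],
    },
}

# Python's json module keeps the LAST duplicate key, illustrating how the
# curl payload above loses the "prefix" and "imports" declarations:
collapsed = json.loads('{"properties": {"prefix": {}}, "properties": {"code": {}}}')
print(list(collapsed["properties"]))  # only the last duplicate survives
```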
Hey! I got the error with OpenAI. The call goes to the base.py file:
@xitex @jovi-s @juancalvof @william-ingold Please upgrade to the latest langchain-groq. You should no longer get the NotImplementedError.
@sepiatone awesome thanks!!!
@jovi-s Could you be running an older version of the langchain-openai package?
@sepiatone I got it working! Thank you so much for your help :)
Any plans to implement it for langchain-aws? It's already implemented for langchain-anthropic, but it can't be used when calling Anthropic models through Bedrock.
@sepiatone How did you get this working? Even though I'm using the latest versions of langchain-core and langchain-openai, I'm still getting this error.
Make sure to update your packages. Check the constraints in your .toml file: you may be running `poetry update`, but a constraint could be limiting the upgrade. To check for this, run `poetry show --outdated`. The yellow entries are the ones that are not updating because of a constraint (yours or from another package).
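To illustrate (hypothetical versions), a constraint like this in pyproject.toml is exactly the kind that silently blocks the fix, since a caret range on a `0.0.x` version never allows `0.1.x`:

```toml
[tool.poetry.dependencies]
python = "^3.11"
# "^0.0.5" resolves to >=0.0.5,<0.0.6 under Poetry's caret rules, so
# `poetry update` can never reach the 0.1.x releases where
# with_structured_output is implemented. Loosen it, e.g.:
langchain-openai = "^0.1.7"
```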
@juancalvof I updated all the packages; these are my langchain package versions:
langchain==0.1.20 langchain-core==0.1.52 langchain-openai==0.1.7
The issue was still there.
> @sepiatone How did you get this working? Even though I'm using the latest versions of langchain-core and langchain-openai, I'm still getting this error.

Please post an MRE (minimal reproducible example) and somebody can help.
Same issue with:

> Any plans to implement it for langchain-aws? It's already implemented for langchain-anthropic, but it can't be used when calling Anthropic models through Bedrock.

I've requested support for this in "Implement `with_structured_output` for ChatBedrock from langchain-aws" (langchain-ai/langchain · Discussion #22701). I've also opened a Stack Overflow post, "Is there a compatibility table between Bedrock Claude and Anthropic Claude APIs", with the goal of discovering what langchain-specific workarounds are possible.
Facing the same error with langchain==0.2.5 and langchain-core==0.2.9. I tried upgrading langchain to the latest version. I am using ChatOpenAI from langchain_openai.
Checked other resources
Example Code
```python
model = ChatGroq(model_name="mixtral-8x7b-32768")

class Joke(BaseModel):
    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")

model_with_structure = model.with_structured_output(Joke, method="json_mode")
f = model_with_structure.invoke(
    "Tell me a joke about cats, respond in JSON with `setup` and `punchline` keys"
)
```

Error Message and Stack Trace (if applicable)
```
/python/3_11/venv/lib/python3.11/site-packages/langchain_core/_api/beta_decorator.py:87: LangChainBetaWarning: The function `with_structured_output` is in beta. It is actively being worked on, so the API may change.
  warn_beta(
Traceback (most recent call last):
  File "/python/3_11/wp_app/src/aibro_langchain.py", line 25, in <module>
    model_with_structure = model.with_structured_output(Joke, method="json_mode")
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/python/3_11/venv/lib/python3.11/site-packages/langchain_core/_api/beta_decorator.py", line 110, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/python/3_11/venv/lib/python3.11/site-packages/langchain_core/language_models/base.py", line 204, in with_structured_output
    raise NotImplementedError()
NotImplementedError
```

Description
Try to get structured output from groq
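For context: with `method="json_mode"` the model is instructed to emit a raw JSON object, which is then parsed and validated against the schema. A minimal stdlib-only sketch of that final step (no langchain involved; the field names mirror the `Joke` model above, and the response string is made up for illustration):

```python
import json

def parse_joke(raw: str) -> dict:
    """Parse a json_mode-style response and check the expected keys."""
    data = json.loads(raw)
    missing = {"setup", "punchline"} - data.keys()
    if missing:
        raise ValueError(f"missing keys: {missing}")
    return data

# A made-up model response, for illustration only:
raw_response = (
    '{"setup": "Why did the cat sit on the computer?",'
    ' "punchline": "To keep an eye on the mouse."}'
)
joke = parse_joke(raw_response)
print(joke["punchline"])
```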
System Info
langchain==0.1.14 langchain-community==0.0.31 langchain-core==0.1.40 langchain-groq==0.0.1 langchain-openai==0.0.5
Mac, Python 3.11