Closed fisipro2 closed 5 months ago
🤖
Hello @fisipro2,
Great to see you again here in the LangChain repository! I hope you're doing well. Looking forward to assisting you with your issue.
It seems like you've got this under control, if you want help or have specific questions, let me know what I can do for you!
bump
bump
looks like we have a better answer here: https://github.com/langchain-ai/langchain/issues/12953#issuecomment-1796524810 or here: https://js.langchain.com/docs/integrations/chat/openai
The kwarg method works, but breaks agents.
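To see why forcing JSON mode can break an agent, here is an illustrative stdlib-only sketch (this is NOT LangChain's actual output parser, just a stand-in for the idea): ReAct-style agents scan the model text for "Action:" / "Final Answer:" markers, and a bare JSON object produced by JSON mode contains neither.

```python
import re

# Illustrative sketch only -- not LangChain's real parser.
# ReAct-style agents look for "Action:" / "Final Answer:" markers.
def parse_react(text: str) -> dict:
    match = re.search(r"Action: (.*)\nAction Input: (.*)", text)
    if match:
        return {"action": match.group(1), "input": match.group(2)}
    if "Final Answer:" in text:
        return {"final": text.split("Final Answer:", 1)[1].strip()}
    raise ValueError(f"Could not parse LLM output: {text!r}")

# Marker-style text parses fine:
print(parse_react("Action: search\nAction Input: weather"))

# Bare JSON-mode output has no markers, so parsing fails:
try:
    parse_react('{"result": 42}')
except ValueError as exc:
    print(exc)
```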
In this notebook under the JSON mode section, the response format is set in Python like this:
from langchain.chat_models import ChatOpenAI

chat = ChatOpenAI(model="gpt-3.5-turbo-1106").bind(
    response_format={"type": "json_object"}
)
It seems to work. If I removed "Return a JSON list." from the SystemMessage, I received an error (OpenAI requires the word "json" to appear in the messages when JSON mode is enabled), which confirms that the response_format was actually applied.
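As a quick local illustration of what JSON mode guarantees: the model's content comes back as a plain JSON string, so it parses directly with json.loads. The content string below is hypothetical; the real text depends on your prompt and model.

```python
import json

# Hypothetical content a JSON-mode model might return.
content = '{"colors": ["red", "green", "blue"]}'

data = json.loads(content)
print(data["colors"])
```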
Bumping this!
llm = AzureChatOpenAI(
    model_kwargs={"response_format": {type: "json_object"}},
)
When running this via AgentExecutor, I get the following error:

File "/opt/homebrew/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/json/__init__.py", line 238, in dumps
    **kw).encode(obj)
File "/opt/homebrew/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
File "/opt/homebrew/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
TypeError: keys must be str, int, float, bool or None, not type
Bump.
This particular case is because you're missing quotes around the `type` key. It should be:
model_kwargs={"response_format": {"type": "json_object"}}
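The fix above can be reproduced with the stdlib alone: without quotes, the key is the Python builtin `type` (a class object), and json.dumps refuses non-string keys with exactly the TypeError from the traceback.

```python
import json

# Using the builtin `type` as a dict key instead of the string "type".
broken = {type: "json_object"}    # key is the builtin class `type`
fixed = {"type": "json_object"}   # key is the string "type"

try:
    json.dumps(broken)
except TypeError as exc:
    print(exc)  # keys must be str, int, float, bool or None, not type

print(json.dumps(fixed))  # {"type": "json_object"}
```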
It's still not working, no? Or is it working for anyone?
@aiwalter it seems to be working for me on Python with the latest release, though I haven't tested it with Agents yet. :)
Issue you'd like to raise.
When using the new GPT models' JSON mode by setting response_format={"type": "json_object"}, the LangChain agent fails to parse the OpenAI output. Is there any plan to support this?
Suggestion:
No response