@pedroresende You can pass the response_format to the chat() method directly, but not yet to the initializer:
llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'])
llm.chat(messages: [{role:"user", content:"hello"}], response_format: {type:"json_object"})
You will get the following error:
"error": {
"message": "'messages' must contain the word 'json' in some form, to use 'response_format' of type 'json_object'."
}
This works:
llm.chat(messages: [{role:"system", content:"json"}, {role:"user", content:"hello"}], response_format: {type:"json_object"})
#<Langchain::LLM::OpenAIResponse:0x000000012ae2dfa8
@model=nil,
@raw_response=
{"id"=>"chatcmpl-A4fnhlAvHOXrZfyamZvKh49e32lKF",
"object"=>"chat.completion",
"created"=>1725677357,
"model"=>"gpt-4o-mini-2024-07-18",
"choices"=>
[{"index"=>0,
"message"=>{"role"=>"assistant", "content"=>"\n{\"response\": \"Hello! How can I assist you today?\"}", "refusal"=>nil},
...
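If you need the content as a Ruby hash rather than a string, you can parse it yourself. A minimal sketch, assuming the response object exposes the assistant content via chat_completion (if your version doesn't, dig into raw_response as noted in the comment):

require "json"

response = llm.chat(
  messages: [{ role: "system", content: "json" }, { role: "user", content: "hello" }],
  response_format: { type: "json_object" }
)

# Equivalent to response.raw_response.dig("choices", 0, "message", "content")
content = response.chat_completion
parsed = JSON.parse(content)
parsed["response"] # => "Hello! How can I assist you today?"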
@pedroresende With this small change you should be able to set it in the initializer:
llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'], default_options: {response_format: {type: "json_object"}})
Please let me know if that works for you!
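For context, a minimal sketch of how that might look end to end, assuming the default_options passed to the initializer get merged into every chat request (so response_format no longer needs to be repeated per call; OpenAI still requires the word "json" to appear in one of the messages):

llm = Langchain::LLM::OpenAI.new(
  api_key: ENV["OPENAI_API_KEY"],
  default_options: { response_format: { type: "json_object" } }
)

# No per-call response_format needed here, assuming the default is applied;
# the system prompt mentions "json" to satisfy OpenAI's requirement.
response = llm.chat(
  messages: [
    { role: "system", content: "Answer in json." },
    { role: "user", content: "hello" }
  ]
)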
Hi @andreibondarev,
thanks once again for the excellent work and support; however, it seems the answer is still plain content instead of JSON.
With this change I'm getting the following response:
[{\"role\":\"system\",\"content\":\"You are a chat bot that answers questions in a pragmatic, confident, inspiring, yet empathetic tone about the information that you have stored in your vectorsearch database. If you don't know the answer, just say i don't know.\"},{\"role\":\"user\",\"content\":\"consider a business that provides a platform for people to fina and connect in their field. The platform should be easy to use, and provide a seamless experience for both the user and the experts. The platform should also be customizable, allowing users to tailor the experience to their specific needs and preferentes. suggest me 10 ideas for this project\"},{\"role\":\"assistant\",\"content\":\"\",\"tool_calls\":[{\"id\":\"call_XPHoahLCOeH9638hf3yaLBzV\",\"type\":\"function\",\"function\":{\"name\":\"langchain_tool_vectorsearch__similarity_search\",\"arguments\":\"{\\\"query\\\":\\\"ideas for a customizable platform for connecting people in their field\\\",\\\"k\\\":10}\"}}]},{\"role\":\"tool\",\"content\":\"#\\u003c#\\u003cClass:0x000000012489c090\\u003e:0x0000000167d9f6a8\\u003e\",\"tool_call_id\":\"call_XPHoahLCOeH9638hf3yaLBzV\"},{\"role\":\"assistant\",\"content\":\"I couldn't find specific ideas for a customizable platform for connecting people in their field in the database. However, I can provide you with some suggestions based on common features of such platforms:\\n\\n1. **Personalized Profiles**: Allow users to create detailed profiles showcasing their expertise, skills, and interests.\\n\\n2. **Matching Algorithm**: Implement an algorithm that suggests relevant connections based on user profiles and preferences.\\n\\n3. **Customizable Filters**: Enable users to filter and search for connections based on specific criteria such as industry, location, or skills.\\n\\n4. **Messaging System**: Provide a secure messaging system for users to communicate and network with each other.\\n\\n5. **Virtual Events**: Host virtual events like webinars, workshops, or networking sessions to facilitate connections and knowledge sharing.\\n\\n6. **Resource Library**: Offer a repository of resources such as articles, guides, and tools to help users in their field.\\n\\n7. **Feedback and Ratings**: Allow users to provide feedback and ratings on interactions with others to maintain quality connections.\\n\\n8. **Groups and Communities**: Create specialized groups or communities within the platform for users with similar interests to connect and collaborate.\\n\\n9. **Integration with Social Media**: Enable users to link their social media profiles to the platform for seamless networking across platforms.\\n\\n10. **Analytics Dashboard**: Provide users with insights into their networking activities, connections made, and engagement metrics to track their progress.\\n\\nThese ideas can help you create a user-friendly and customizable platform for connecting people in their field.\"}]
It only works if I follow up with a request like "return the previous answer in json format".
This should work now! If you're using OpenAI, however, one of the message contents needs to mention the string "json" somewhere.
Is your feature request related to a problem? Please describe.
Currently it's not possible to define the response_format, even though the ruby-openai library supports it: https://github.com/alexrudall/ruby-openai/tree/main?tab=readme-ov-file#json-mode
Describe the solution you'd like
Allow the response format to be defined when initializing Langchain::LLM::OpenAI.
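For reference, the linked ruby-openai README enables JSON mode on the underlying client roughly like this (a sketch based on that README, not on langchainrb itself; the model name is just an example):

require "openai"

client = OpenAI::Client.new(access_token: ENV["OPENAI_API_KEY"])

response = client.chat(
  parameters: {
    model: "gpt-4o-mini",
    response_format: { type: "json_object" },
    messages: [{ role: "user", content: "Greet me with a json object." }]
  }
)

puts response.dig("choices", 0, "message", "content")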