Closed: gowthamkishore3799 closed this issue 2 days ago
```go
ctx := context.Background()

client, err := openai.New(openai.WithModel("gpt-3.5-turbo"))
if err != nil {
	return err
}

content := []llms.MessageContent{
	llms.TextParts(llms.ChatMessageTypeSystem, "You are a company branding design wizard."),
	llms.TextParts(llms.ChatMessageTypeHuman, "What would be a good company name for a company that makes colorful socks?"),
}

// Request a JSON-formatted response.
resp, err := client.GenerateContent(ctx, content, llms.WithJSONMode())
if err != nil {
	return err
}

for _, choice := range resp.Choices {
	fmt.Println(choice.Content)
}
return nil
```
@devalexandre Yes, I was able to achieve it with the following code:
response, err := client.GenerateContent(ctx, content, llms.WithJSONMode())
However, I noticed that `openai.WithResponseFormat(responseFormat)` is accepted for every OpenAI LLM but never actually utilized, so I wanted to report that issue here.
This is not a bug: we have a common options base, but each LLM uses only some of the options, never all of them.
Maybe we could return an error if some options aren't used.
While using openai.WithResponseFormat with the response format set to JSON, I could see that req.ResponseFormat was nil because of improper handling in the code.
Upon inspecting the code, I noticed that the JSON-format check at the following line leaves the option unused: https://github.com/tmc/langchaingo/blob/7eb662b22a5919b8e9daaa4c20100f26b9096b69/llms/openai/openaillm.go#L114, whereas it works as expected in the llms package.