Closed: Jacquesjh closed this issue 3 months ago.
Would it be possible to provide a representative output from the model and/or trace? Also, which model?
Any additional info you can provide on this one? We're happy to look into it. @Jacquesjh
I tried to test this out. One of our samples (js/testapps/flow-simple-ai) has a prompt with JSON output, and it seems to be displaying the output in the prompt playground fine:
I am going to close this one for now as there was no information provided by the user. Please reopen if you have a specific use case with the issue and provide more details about it. Thanks!
@shrutip90 Maybe one thing to double check; in the screenshot the schema is not an object, but an array. Does that have any impact?
Thanks @MichaelDoyle. Tested that out too. I changed the sample to output an array of reasonings:
```yaml
model: vertexai/gemini-1.0-pro
input:
  schema:
    question: string
output:
  format: json
  schema:
    answer: string, the answer to the question
    id: string, the selected id of the saying
    reasoning(array): string, why the saying applies to the question
```
and it still works:
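For reference, the picoschema above describes JSON output with two string fields and a string array. Below is a minimal, hypothetical sketch (field names taken from the sample prompt; the `SayingAnswer` type and `isSayingAnswer` helper are mine, not part of Genkit) of a runtime check you could use to confirm the model actually returned well-formed JSON, e.g. when the playground hides the output and you only have the raw text from the trace:

```typescript
// Hypothetical shape matching the output schema in the sample prompt above.
interface SayingAnswer {
  answer: string;      // the answer to the question
  id: string;          // the selected id of the saying
  reasoning: string[]; // why the saying applies to the question
}

// Type guard mirroring the schema: checks each field's type at runtime.
function isSayingAnswer(x: unknown): x is SayingAnswer {
  if (typeof x !== "object" || x === null) return false;
  const o = x as Record<string, unknown>;
  return (
    typeof o.answer === "string" &&
    typeof o.id === "string" &&
    Array.isArray(o.reasoning) &&
    o.reasoning.every((r) => typeof r === "string")
  );
}

// Example: parse a raw model response copied from the trace, then validate it.
const raw = '{"answer":"Yes","id":"saying-42","reasoning":["It fits the theme."]}';
const parsed: unknown = JSON.parse(raw);
console.log(isSayingAnswer(parsed)); // true
```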
Describe the bug When experimenting in the prompt playground, the model response does not appear when the output format is JSON. The only way to see the output is by viewing its trace. When the output is plain text, it works fine.
Screenshots