When I run the sentiment example, the response from chatglm is always '{\nsentiment : "xxx"\n}', which fails to parse with JSON.parse().
Environment:
I used fastchat and chatglm2-6b-4bit to mock openai api
Root cause:
Because chatglm2-6b-4bit is a weaker model than chatgpt, it does not reliably return well-formed JSON with the example's original prompt.
Solution:
Add one instruction to the prompt in the source code: "Please note that ..."
function createRequestPrompt(request: string): string {
    return `You are a service that translates user requests into JSON objects of type "${validator.typeName}" according to the following TypeScript definitions:\n` +
        `\`\`\`\n${validator.schema}\`\`\`\n` +
        `Please note that the response string must be parseable as a JSON object via JSON.parse(), e.g. {"aaa": "bbb"}\n` +
        `The following is a user request:\n` +
        `"""\n${request}\n"""\n` +
        `The following is the user request translated into a JSON object with 2 spaces of indentation and no properties with the value undefined:\n`;
}
Result:
Now the response from chatglm is '{"sentiment": "xxx"}', which JSON.parse() handles correctly.
Anecdotally, I've heard that a phrase like "the response will be consumed by a program" has a similar effect; we should consider it as well.
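As a complementary mitigation (an assumption on my part, not part of TypeChat), the caller could also parse defensively: weaker models often wrap otherwise-valid JSON in markdown fences or surrounding prose, and extracting the outermost brace-delimited span before calling JSON.parse() recovers those cases. The `extractJson` helper below is hypothetical; note it does not fix genuinely invalid JSON such as unquoted keys.

```typescript
// Hypothetical helper (not in TypeChat): pull the first '{'..last '}' span
// out of a model response before handing it to JSON.parse(). This recovers
// responses wrapped in markdown fences or chatty prose, but cannot repair
// invalid JSON such as the unquoted key in '{\nsentiment : "xxx"\n}'.
function extractJson(response: string): unknown {
    const start = response.indexOf("{");
    const end = response.lastIndexOf("}");
    if (start < 0 || end <= start) {
        throw new Error("no JSON object found in response");
    }
    return JSON.parse(response.slice(start, end + 1));
}

// A clean response parses directly:
console.log(extractJson('{"sentiment": "positive"}'));
// A fenced response still yields the object:
console.log(extractJson('```json\n{"sentiment": "negative"}\n```'));
```

Prompt-side fixes like the one above remain preferable; this is just a fallback so a single stray fence doesn't fail the whole request.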