vercel / modelfusion

The TypeScript library for building AI applications.
https://modelfusion.dev
MIT License

OpenAI Azure streaming does not work #185

Closed: lgrammel closed this issue 8 months ago

lgrammel commented 8 months ago

I have no problem using generateText through Azure, but streamText with the same parameters consistently fails with an error. Aren't the two methods supposed to take the same parameters?

const textStream = await streamText(
    new OpenAIChatModel({
        api: new AzureOpenAIApiConfiguration({
            apiKey: process.env.AZURE_OPENAI_API_KEY,
            resourceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
            deploymentId: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
            apiVersion: process.env.AZURE_OPENAI_API_VERSION,
        }),
        model: "gpt-3.5-turbo",
    }),
    [
        OpenAIChatMessage.system("You are a story writer. Write a story about:"),
        OpenAIChatMessage.user("A robot learning to love"),
    ]
);
Error:
Error: JSONParseError: JSON parsing failed: Text: {"id":"","object":"","created":0,"model":"","prompt_filter_results":[{"prompt_index":0,"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"choices":[]}.
Error message: [
  {
    "received": "",
    "code": "invalid_literal",
    "expected": "chat.completion.chunk",
    "path": [
      "object"
    ],
    "message": "Invalid literal value, expected \"chat.completion.chunk\""
  }
]
    at AIService.streamText (/Users/popmart/song/src/acs/apps/server/dist/main.js:32952:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
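
For comparison, a sketch of the non-streaming call that the report says works with the same configuration. It assumes generateText takes the same (model, messages) arguments as streamText and that the symbols are imported from the modelfusion package; the environment variable names mirror the failing snippet above.

import {
    generateText,
    OpenAIChatModel,
    OpenAIChatMessage,
    AzureOpenAIApiConfiguration,
} from "modelfusion";

// Sketch only: the non-streaming counterpart reported to work against Azure.
// Configuration and messages are copied from the failing streamText snippet.
const text = await generateText(
    new OpenAIChatModel({
        api: new AzureOpenAIApiConfiguration({
            apiKey: process.env.AZURE_OPENAI_API_KEY,
            resourceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
            deploymentId: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
            apiVersion: process.env.AZURE_OPENAI_API_VERSION,
        }),
        model: "gpt-3.5-turbo",
    }),
    [
        OpenAIChatMessage.system("You are a story writer. Write a story about:"),
        OpenAIChatMessage.user("A robot learning to love"),
    ]
);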
lgrammel commented 8 months ago

Resolved with v0.74.1
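
For context on the failure: the text that fails to parse is the first chunk Azure sends on a streaming response. It carries only prompt_filter_results, with an empty object field and an empty choices array, so a schema that requires object to be the literal "chat.completion.chunk" rejects it with exactly the invalid_literal error shown above. The sketch below illustrates that kind of tolerant handling with zod; it is an illustration under those assumptions, not ModelFusion's actual schema or the code of the v0.74.1 fix.

import { z } from "zod";

// Hypothetical strict chunk schema, mirroring the error in the issue:
// it requires object to be exactly "chat.completion.chunk".
const chatChunkSchema = z.object({
    object: z.literal("chat.completion.chunk"),
    id: z.string(),
    created: z.number(),
    model: z.string(),
    choices: z.array(
        z.object({
            delta: z.object({ content: z.string().nullable().optional() }),
            index: z.number(),
        })
    ),
});

function parseChunk(raw: string) {
    const data = JSON.parse(raw);

    // Azure's content-filter preamble chunk has object: "" and choices: [].
    // Skipping it before validation avoids the invalid_literal failure.
    if (data.object === "" && Array.isArray(data.choices) && data.choices.length === 0) {
        return undefined;
    }

    return chatChunkSchema.parse(data);
}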