sgomez / ollama-ai-provider

Vercel AI Provider for running LLMs locally using Ollama
https://www.npmjs.com/package/ollama-ai-provider

Issue when using the streamText #21

Closed SklyerX closed 3 months ago

SklyerX commented 3 months ago

While using the streamText function from the Vercel AI SDK, I noticed an issue that makes text streaming not work.

The provider sends a request to http://127.0.0.1:11434/api/chat, which results in:

```
AI_APICallError: Failed to process successful response
url: "http://127.0.0.1:11434/api/chat"
```

I don't know why that is. Could it be that it has to hit the /generate endpoint for streaming to work? Any help would be very much appreciated.
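For reference, here is a minimal sketch of how I am wiring the provider into streamText. The model name and prompt are placeholders, and it assumes a local Ollama server on the default port; depending on the AI SDK version, streamText may or may not need to be awaited:

```typescript
// Minimal repro sketch. `streamText` comes from the Vercel AI SDK ('ai'
// package) and `ollama` from ollama-ai-provider. The model name below is
// a placeholder; substitute any model you have pulled locally.
import { streamText } from 'ai';
import { ollama } from 'ollama-ai-provider';

const result = await streamText({
  model: ollama('llama3'), // hypothetical model name
  prompt: 'Why is the sky blue?',
});

// Consume the text stream chunk by chunk as it arrives.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```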

SklyerX commented 3 months ago

The issue was with how Bun processed the response; there is no issue with the package.