PeterTongHu closed this issue 3 months ago
I wonder if the LLM is giving up and not sending back a full response - you could try catching the error in llm.ts
and printing what the response was; maybe await response.text()
would return something? My guess is this is running against local Ollama?
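
For reference, here's a minimal sketch of what that could look like in llm.ts - I don't know the actual code, so the function name, endpoint URL, and request payload below are just placeholders. The key idea is to read the body as text before parsing, so a truncated or empty response gets logged instead of just throwing:

```ts
// Hypothetical sketch, assuming llm.ts calls fetch() against a local Ollama
// server and then parses the body as JSON. Names and URL are placeholders.
async function generateMessage(prompt: string): Promise<unknown> {
  const response = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt, stream: false }),
  });

  // Read the body as raw text first; a response body can only be consumed
  // once, and this lets us see exactly what came back if parsing fails.
  const raw = await response.text();
  try {
    return JSON.parse(raw);
  } catch (err) {
    // An empty or cut-off body is what produces "Unexpected end of JSON input".
    console.error("Failed to parse LLM response as JSON. Raw body:", raw);
    throw err;
  }
}
```

If the logged raw body turns out to be empty or cut off mid-object, that would point to the model timing out or the connection being dropped before the full response arrives.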
Here is the error: generating messages keeps timing out, and there has been an "Unexpected end of JSON input" error.