gururise opened this issue 1 month ago
Same thing happens on the new Llama 3.1
It appears to just be the Llama models from together.ai that are failing. I am able to use deepseek-33b from Together.ai w/o issue.
Uh that's strange. Can you get the models to work outside chat-ui (in that case it's an issue with us) or do you think the models just don't work/don't follow the spec?
Yes, the models work fine outside chat-ui, including direct API calls. Here is some more debug info:
```
56535/04/25 01:15PM 50 pid=22 hostname=8e21ba66015c err={"type":"InternalServerError","message":"500 (unknown path)\n TypeError: str.replace is not a function","stack":"Error: 500 (unknown path)\n TypeError: str.replace is not a function\n at APIError.generate (file:///app/build/server/chunks/index-5eb7da60.js:4421:20)\n at OpenAI.makeStatusError (file:///app/build/server/chunks/index-5eb7da60.js:5282:25)\n at OpenAI.makeRequest (file:///app/build/server/chunks/index-5eb7da60.js:5325:30)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async file:///app/build/server/chunks/models-7f8c2c80.js:100201:36\n at async generateFromDefaultEndpoint (file:///app/build/server/chunks/index3-e0ccc6aa.js:213:23)\n at async generateTitle (file:///app/build/server/chunks/_server.ts-2923e711.js:213:10)\n at async generateTitleForConversation (file:///app/build/server/chunks/_server.ts-2923e711.js:177:19)","status":500,"headers":{"access-control-allow-origin":"*","alt-svc":"h3=\":443\"; ma=86400","cf-cache-status":"DYNAMIC","cf-ray":"8a8c943b1d224976-MIA","connection":"keep-alive","content-length":"129","content-type":"application/json; charset=utf-8","date":"Thu, 25 Jul 2024 13:43:27 GMT","etag":"\"t53xzkqk453l\"","server":"cloudflare","strict-transport-security":"max-age=2592000; includeSubDomains","vary":"Accept-Encoding","x-api-received":"2024-07-25T13:43:27.281Z","x-inference-version":"v2","x-ratelimit-limit":"20","x-ratelimit-remaining":"15","x-ratelimit-reset":"1"},"error":{"type":"Object","message":"(unknown path)\n TypeError: str.replace is not a function","stack":"","param":null,"code":null},"code":null,"param":null} msg=500 (unknown path)
TypeError: str.replace is not a function
```
```
InternalServerError: 500 (unknown path)
 TypeError: str.replace is not a function
    at APIError.generate (file:///app/build/server/chunks/index-5eb7da60.js:4421:20)
    at OpenAI.makeStatusError (file:///app/build/server/chunks/index-5eb7da60.js:5282:25)
    at OpenAI.makeRequest (file:///app/build/server/chunks/index-5eb7da60.js:5325:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async file:///app/build/server/chunks/models-7f8c2c80.js:100201:36
    at async generate (file:///app/build/server/chunks/_server.ts-2923e711.js:423:30)
    at async textGenerationWithoutTitle (file:///app/build/server/chunks/_server.ts-2923e711.js:484:3) {
  status: 500,
```
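For what it's worth, `TypeError: str.replace is not a function` generally means something other than a string (for example, an error payload object rather than the expected text) is handed to code that calls `.replace` on it. A minimal sketch of that failure mode and a defensive fix; `formatPrompt` is a hypothetical helper, not chat-ui's actual code:

```javascript
// Hypothetical helper that assumes its argument is a string,
// as the failing code path presumably does somewhere.
function formatPrompt(str) {
  return str.replace(/\n/g, " ");
}

// Works as expected with a plain string.
console.log(formatPrompt("hello\nworld")); // "hello world"

// Throws the same TypeError as the log when handed an object,
// e.g. an error payload instead of generated text.
try {
  formatPrompt({ error: { message: "500 (unknown path)" } });
} catch (e) {
  console.log(e instanceof TypeError); // true
}

// Defensive variant: coerce non-strings before calling .replace.
function formatPromptSafe(value) {
  const str = typeof value === "string" ? value : JSON.stringify(value);
  return str.replace(/\n/g, " ");
}
```

If that guess is right, the upstream 500 from Together is the trigger, and the TypeError is chat-ui mishandling the error body rather than the root cause.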
I'm not sure what the issue is. This config used to work on the older hugging chat 0.8.2. All my other models (OpenAI, Anthropic) work fine; it's just the Llama-3-70b from Together that fails.