Hi everyone,

I have the following setup (containers are on the same device):

This is the content of the `.env` file:

And this is the error I get when I try to run inference from browser:

```json
{"level":50,"time":1719403859826,"pid":31,"hostname":"592d634d7447","err":{"type":"BadRequestError","message":"400 status code (no body)","stack":"Error: 400 status code (no body)\n at APIError.generate (file:///app/build/server/chunks/index-3aabce5f.js:4400:20)\n at OpenAI.makeStatusError (file:///app/build/server/chunks/index-3aabce5f.js:5282:25)\n at OpenAI.makeRequest (file:///app/build/server/chunks/index-3aabce5f.js:5325:30)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async file:///app/build/server/chunks/models-e8725572.js:98846:36\n at async generateFromDefaultEndpoint (file:///app/build/server/chunks/index3-2417d430.js:213:23)\n at async generateTitle (file:///app/build/server/chunks/_server.ts-2c825ade.js:213:10)\n at async generateTitleForConversation (file:///app/build/server/chunks/_server.ts-2c825ade.js:177:19)","status":400,"headers":{"content-length":"1980","content-type":"application/json","date":"Wed, 26 Jun 2024 12:10:59 GMT","server":"uvicorn"}},"msg":"400 status code (no body)"}
```

```
BadRequestError: 400 status code (no body)
    at APIError.generate (file:///app/build/server/chunks/index-3aabce5f.js:4400:20)
    at OpenAI.makeStatusError (file:///app/build/server/chunks/index-3aabce5f.js:5282:25)
    at OpenAI.makeRequest (file:///app/build/server/chunks/index-3aabce5f.js:5325:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async file:///app/build/server/chunks/models-e8725572.js:98846:36
    at async generate (file:///app/build/server/chunks/_server.ts-2c825ade.js:426:30)
    at async textGenerationWithoutTitle (file:///app/build/server/chunks/_server.ts-2c825ade.js:487:3) {
  status: 400,
  headers: {
    'content-length': '543',
    'content-type': 'application/json',
    date: 'Wed, 26 Jun 2024 12:10:59 GMT',
    server: 'uvicorn'
  },
  request_id: undefined,
  error: undefined,
  code: undefined,
  param: undefined,
  type: undefined
}
```

Is there something wrong with the `.env` file, or is Nvidia NIM simply not supported for some strange reason?
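One thing worth noting: despite the "no body" message, the logged response headers report `content-length: 1980` with `content-type: application/json`, so the NIM server *did* send back a JSON error body; the OpenAI client inside chat-ui just isn't surfacing it. A small sketch that pulls those fields out of the pino log line from above (the `stack` field is trimmed here for brevity) makes that visible:

```python
import json

# The server log line from above, with the long "stack" string omitted.
log_line = (
    '{"level":50,"time":1719403859826,"pid":31,"hostname":"592d634d7447",'
    '"err":{"type":"BadRequestError","message":"400 status code (no body)",'
    '"status":400,"headers":{"content-length":"1980",'
    '"content-type":"application/json","date":"Wed, 26 Jun 2024 12:10:59 GMT",'
    '"server":"uvicorn"}},"msg":"400 status code (no body)"}'
)

entry = json.loads(log_line)
err = entry["err"]

# "no body" notwithstanding, the headers show a 1980-byte JSON error body was returned.
print(err["type"], err["status"], err["headers"]["content-length"])
# -> BadRequestError 400 1980
```

So a useful next debugging step would be to POST the same request straight to the NIM container's OpenAI-compatible `/v1/chat/completions` endpoint (e.g. with `curl`) and read the JSON body it returns; in my understanding a 400 from NIM is often a model name that doesn't match what the container actually serves, or an unsupported request parameter.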