raygao opened 1 year ago
Can you share some code to reproduce this issue?
Never mind. I got it working. You can close the ticket now.
The following does work:

    const chatCompletion = await openAI.createChatCompletion({
      model: "gpt-3.5-turbo",
      messages: [
        { "role": "system", "content": "You are a helpful assistant." },
        { "role": "user", "content": "Who won the world series in 2020?" },
        { "role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020." },
        { "role": "user", "content": "Where was it played?" },
      ],
    });
    console.log(chatCompletion.choices);
We should make it clearer in the docs. Thank you for raising this issue!
We might as well make a full doc site with Pyro.
Not a bad idea.
Another question: I have a web frontend calling a REST function that uses openAI.createChatCompletion(). But it does not seem to block; as a result, I get a "TypeError: Failed to fetch" error in the browser and never manage to get any content back. It is using the Fresh framework. I call the REST API from the frontend with fetch, and the REST API contains the openAI function. At the same time, the same call to the same REST API works fine from Postman, without a hitch. Somehow createChatCompletion() just gets skipped over, despite the await or .then chaining on the async function. Thanks.
Can you provide a GitHub repo with a reproduction of the issue? It only has to include an API route.
Yes, I have a repo: https://github.com/raygao/uxig. In /api/genStory/index.tsx, I call genStory(body), which calls openai.createChatCompletion(...), and this is where I am having difficulties.
Also, the frontend.
In the code you provided, you return a string from /api/genStory/index.tsx. You can't do that; you must return a Response, which is why there is a red squiggly line under POST. You should change your handler to use

    return new Response(results);

instead of

    return results;
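For reference, a Fresh API route handler along those lines might look roughly like the sketch below. This is only a sketch: the genStory import path and the request body handling are assumptions, not code from the linked repo.

```ts
// routes/api/genStory/index.tsx (sketch)
import { Handlers } from "$fresh/server.ts";
import { genStory } from "../../../utils/genStory.ts"; // hypothetical location of the helper

export const handler: Handlers = {
  async POST(req) {
    const body = await req.json();
    const results = await genStory(body); // wraps openAI.createChatCompletion(...)
    // The handler must return a Response object, not a bare string.
    return new Response(results);
  },
};
```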
Thanks a lot for the help. That hint fixed the red squiggly line under the POST method. However, the browser still has a validation issue (see screenshots). As soon as it calls createChatCompletion, the fetch call fails. I don't understand this.
Okay, I think I found the bug. You fetch the API route using

    const result = await fetch(url, theScenario);

when you should be doing

    const result = await fetch(url, { body: theScenario });
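One extra note in case it helps: fetch's second argument has to be a RequestInit object, so a payload passed directly is silently ignored, and when posting JSON you typically also want a method and a Content-Type header. A sketch only; the route path and theScenario's shape are assumptions:

```ts
// Sketch of the frontend call; "/api/genStory" and the payload shape are placeholders.
const theScenario = { prompt: "A short story about a hawk" }; // hypothetical payload
const result = await fetch("/api/genStory", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(theScenario),
});
const story = await result.text();
console.log(story);
```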
Amazing how fast you guys respond! Much appreciated. Actually, the solution is even simpler. I googled just now, after struggling with it for a full day. When I add event.preventDefault() to my onclick() method, the issue magically disappears. https://stackoverflow.com/questions/64619140/my-fetch-requests-are-being-canceled-and-i-dont-know-why
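For anyone hitting the same thing, the pattern from that StackOverflow answer looks roughly like the sketch below; the route and payload are placeholders, not the actual app code.

```ts
// Sketch of the fix: without event.preventDefault(), clicking a button inside a form
// triggers a normal form submit, and the resulting navigation cancels the in-flight
// fetch, which surfaces in the browser as "TypeError: Failed to fetch".
document.querySelector("button")?.addEventListener("click", async (event) => {
  event.preventDefault();
  const response = await fetch("/api/genStory", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ scenario: "example payload" }), // hypothetical payload
  });
  console.log(await response.text());
});
```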
Nice, good to hear that the bug is fixed!
Please find the resulting app running at https://comfortable-hawk-99.deno.dev/. Thanks a lot for your support, I appreciate it.
Reopening so that we can improve the docs. Happy to hear you were able to deploy your app!
Error: invalid_request_error: This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions? This is the feedback you get from GPT-3.5 nowadays; the example is out of date.
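For reference, chat models such as gpt-3.5-turbo have to be sent to the /v1/chat/completions endpoint. A minimal, library-agnostic sketch using fetch; the environment variable name is an assumption:

```ts
// Chat models are served by /v1/chat/completions, not /v1/completions.
const response = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": `Bearer ${Deno.env.get("OPENAI_API_KEY")}`, // assumes the key is in an env var
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Who won the world series in 2020?" },
    ],
  }),
});
const data = await response.json();
console.log(data.choices[0].message.content);
```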