BintzGavin opened 3 months ago
Can't you just use whatever model you want in your API route handler?
```ts
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

// Inside your route handler (e.g. a POST handler):
const response = await streamText({
  model: openai.chat("gpt-4o-mini"),
  prompt, // the prompt (or `messages`) parsed from the request body
});
return response.toTextStreamResponse();
```
Yeah, exactly.
How do I get the AI autocomplete functionality? The demo version doesn't seem to have it. What am I missing?
Describe the feature you'd like to request
I'd like to recommend that this package update its default language model from GPT-3.5-turbo to the newer GPT-4o-mini.
Describe the solution you'd like to see
Updating the model to GPT-4o-mini would provide several benefits: it is significantly cheaper per token than GPT-3.5-turbo, supports a much larger context window, and generally produces higher-quality output.
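To make the cost argument concrete, here is a rough per-request cost comparison. The per-token prices below are OpenAI's published list prices at the time of writing and may change, so treat them as illustrative assumptions rather than guaranteed figures:

```typescript
// Rough per-request cost comparison between gpt-3.5-turbo and gpt-4o-mini.
// Prices are USD per 1M tokens, taken from OpenAI's published pricing at the
// time of writing -- they may change, so these are illustrative assumptions.
const PRICES = {
  "gpt-3.5-turbo": { input: 0.5, output: 1.5 },
  "gpt-4o-mini": { input: 0.15, output: 0.6 },
} as const;

type Model = keyof typeof PRICES;

// Cost in USD of a single request with the given token counts.
function requestCost(model: Model, inputTokens: number, outputTokens: number): number {
  const p = PRICES[model];
  return (inputTokens * p.input + outputTokens * p.output) / 1_000_000;
}

// Example: a 1,000-token prompt producing a 500-token completion.
const oldCost = requestCost("gpt-3.5-turbo", 1000, 500); // $0.00125
const newCost = requestCost("gpt-4o-mini", 1000, 500); // $0.00045
console.log(`gpt-3.5-turbo: $${oldCost}, gpt-4o-mini: $${newCost}`);
```

At these prices the switch cuts per-request cost by roughly two thirds, on top of the quality improvement.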
Additional information
No response