Closed · retired-Hurt closed this 2 weeks ago
Have you updated the provider to the latest version?
I think I know what's happening here. The OpenAI provider (`"@ai-sdk/openai": "^0.0.9"`) brings in the dependency `"@ai-sdk/provider": "0.0.3"`, while the non-OpenAI providers bring in `@ai-sdk/provider@0.0.10`. So my pnpm-lock.yaml file looks like this:
```yaml
'@ai-sdk/provider@0.0.10':
  resolution: {integrity: sha512-NzkrtREQpHID1cTqY/C4CI30PVOaXWKYytDR2EcytmFgnP7Z6+CrGIA/YCnNhYAuUm6Nx+nGpRL/Hmyrv7NYzg==}
  engines: {node: '>=18'}

'@ai-sdk/provider@0.0.3':
  resolution: {integrity: sha512-0B8P6VZpJ6F9yS9BpmJBYSqIaIfeRtL5tD5SP+qgR8y0pPwalIbRMUFiLz9YUT6g70MJsCLpm/2/fX3cfAYCJw==}
  engines: {node: '>=18'}
```
Somehow these two copies conflict inside VS Code. I still have to work out how to get around it.
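One way to confirm the duplication is to search the lockfile for every resolved copy of `@ai-sdk/provider` (in a real project, `pnpm why @ai-sdk/provider` will additionally show which packages pull each copy in). A self-contained sketch, run against an inline lockfile excerpt so it works anywhere:

```shell
# Write a lockfile excerpt, then list every distinct copy of
# @ai-sdk/provider that pnpm resolved.
cat > /tmp/pnpm-lock-excerpt.yaml <<'EOF'
'@ai-sdk/provider@0.0.10':
  resolution: {integrity: sha512-...}
'@ai-sdk/provider@0.0.3':
  resolution: {integrity: sha512-...}
EOF

grep -o "@ai-sdk/provider@[0-9.]*" /tmp/pnpm-lock-excerpt.yaml | sort -u
# Two lines of output means two conflicting copies are installed.
```

Run the same `grep` against your project's own `pnpm-lock.yaml` to see how many copies you have.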
Can you update the openai provider to `^0.0.31`?
Here is how I fixed it: I removed node_modules, did a fresh install of all the modules, and updated openai to `^4.51.0`. Things now seem to work fine even with the openai provider at `^0.0.9`.
But now I'm getting a different error on the line `const uiState = getUIStateFromAIState(aiState)`, with a message beginning `Argument of type 'Readonly`.
Closing the issue, as installing the right module versions fixed the error.
I'm hitting this same issue. How did you resolve it exactly? Thanks
Is the above fix not working? Try with the following versions, and do not install the standalone `openai` package:

```json
"@ai-sdk/openai": "^0.0.9",
"ai": "^3.2.0"
```
Running npm update on ai and ai-sdk fixed it for me. Thanks
Description
All non-OpenAI providers give an error like `Type 'GoogleGenerativeAILanguageModel' is not assignable to type 'LanguageModelV1'`.

Steps to reproduce

1. Clone the repo (https://github.com/vercel/ai-chatbot) to local.
2. In `lib/chat/actions.tsx`, add the import `import { google } from '@ai-sdk/google'`.
3. Replace `model: openai('gpt-3.5-turbo'),` with `model: google('models/gemini-pro'),` and VS Code shows the compilation error above.

I have tried this with other providers such as mistral and anthropic, and all of them give the same error. The groq provider, however, does not, as it uses the OpenAI provider.
Is there an underlying problem with the SDK or is there something I am missing here?
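For what it's worth, this error pattern is consistent with two structurally different copies of `LanguageModelV1` being checked against each other. The sketch below uses hypothetical, simplified stand-ins for the real interfaces in `@ai-sdk/provider` (the member shown is illustrative) to show how a property widened in a newer version stops being assignable to the older one:

```typescript
// Hypothetical, simplified stand-ins for LanguageModelV1 as shipped by two
// different versions of @ai-sdk/provider (member types are illustrative).
interface ModelV1Old {
  specificationVersion: 'v1'
  defaultObjectGenerationMode: 'json' | undefined
}
interface ModelV1New {
  specificationVersion: 'v1'
  defaultObjectGenerationMode: 'json' | 'tool' | undefined // widened in the newer version
}

// Type-level probe: resolves to true only when A is assignable to B.
type Assignable<A, B> = A extends B ? true : false

// A model built against the new interface no longer fits the old one,
// which matches the "X is not assignable to LanguageModelV1" symptom:
const newFitsOld: Assignable<ModelV1New, ModelV1Old> = false
const oldFitsNew: Assignable<ModelV1Old, ModelV1New> = true

console.log({ newFitsOld, oldFitsNew })
```

If this is the cause, deduplicating the lockfile so every provider resolves the same `@ai-sdk/provider` version should remove the mismatch.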
Code example
```tsx
import 'server-only'

import {
  createAI, createStreamableUI, getMutableAIState, getAIState, streamUI, createStreamableValue
} from 'ai/rsc'
import { openai } from '@ai-sdk/openai'
import { vertex } from '@ai-sdk/google-vertex'
import { spinner, BotCard, BotMessage, SystemMessage, Stock, Purchase } from '@/components/stocks'
import { z } from 'zod'
import { EventsSkeleton } from '@/components/stocks/events-skeleton'
import { Events } from '@/components/stocks/events'
import { StocksSkeleton } from '@/components/stocks/stocks-skeleton'
import { Stocks } from '@/components/stocks/stocks'
import { StockSkeleton } from '@/components/stocks/stock-skeleton'
import { formatNumber, runAsyncFnWithoutBlocking, sleep, nanoid } from '@/lib/utils'
import { saveChat } from '@/app/actions'
import { SpinnerMessage, UserMessage } from '@/components/stocks/message'
import { Chat, Message } from '@/lib/types'
import { auth } from '@/auth'
import { streamText } from 'ai'

async function confirmPurchase(symbol: string, price: number, amount: number) {
  'use server'

  const aiState = getMutableAIState()

  const purchasing = createStreamableUI(
    <p>Purchasing {amount} ${symbol}...</p>
  )

  const systemMessage = createStreamableUI(null)

  runAsyncFnWithoutBlocking(async () => {
    await sleep(1000)
  })

  return {
    purchasingUI: purchasing.value,
    newMessage: { id: nanoid(), display: systemMessage.value }
  }
}

async function submitUserMessage(content: string) {
  'use server'

  const aiState = getMutableAIState()

  aiState.update({
    ...aiState.get(),
    messages: [...aiState.get().messages, { id: nanoid(), role: 'user', content }]
  })

  let textStream: undefined | ReturnType<typeof createStreamableValue>
  let textNode: undefined | React.ReactNode

  const result = await streamUI({
    model: openai('gpt-3.5-turbo'),
    // model: vertex('models/gemini-pro'),
    initial: <SpinnerMessage />,
    system: `Hi`,
    messages: [
      ...aiState.get().messages.map((message: any) => ({
        role: message.role,
        content: message.content,
        name: message.name
      }))
    ],
    text: ({ content, done, delta }) => {
      if (!textStream) {
        textStream = createStreamableValue('')
        textNode = <BotMessage content={textStream.value} />
      }
      return textNode
    }
  })

  return { id: nanoid(), display: result.value }
}

export type AIState = {
  chatId: string
  messages: Message[]
}

export type UIState = {
  id: string
  display: React.ReactNode
}[]

export const AI = createAI<AIState, UIState>({
  actions: { submitUserMessage, confirmPurchase },
  initialUIState: [],
  initialAIState: { chatId: nanoid(), messages: [] },
  onGetUIState: async () => {
    'use server'
  },
  onSetAIState: async ({ state }) => {
    'use server'
  }
})

export const getUIStateFromAIState = (aiState: Chat) => {
  return aiState.messages
    .filter(message => message.role !== 'system')
    .map((message, index) => ({
      id: `${aiState.chatId}-${index}`,
      display:
        message.role === 'tool' ? (
          message.content.map(tool => {
            return tool.toolName === 'listStocks' ? (
              <BotCard>
                <Stocks props={tool.result} />
              </BotCard>
            ) : null
          })
        ) : null
    }))
}
```
Additional context
I suspect this is related to the underlying framework, as all I am doing is replacing Vercel's openai provider with its google provider.