WindEnWonder opened 2 months ago
Hey, thanks so much!! Do you have an approximate date for when Gemini support will be added? In the meantime, which models are actually capable of experimental_streamObject and experimental_streamText (other than OpenAI's)? Are there any free LLMs, such as Gemini, that can do experimental_streamObject and experimental_streamText? Again, thank you so much for your time and assistance!!
> Do you have an approximate date when support will be added for Gemini?
I'm in the same boat as you, mate. I wanted to try Gemini and got a similar error 😅
> Are there any free LLMs
I imagine you can use LLMs with an OpenAI-compatible API; it might be worth testing with Ollama.
Gemini does not support forced function calling yet (`streamObject`). You can use it with tools via `streamText` (https://sdk.vercel.ai/docs/ai-core/stream-text); see e.g. https://github.com/vercel/ai/pull/1212 for an RSC example.
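Until forced function calling lands, one common interim pattern is to request JSON through `streamText`/`generateText` and parse the finished text yourself. A minimal, SDK-free sketch (the `extractJson` helper is my own illustration, not part of the AI SDK):

```typescript
// extractJson: pull the first JSON object out of a model's text reply.
// Models often wrap JSON in markdown fences; slicing from the first '{'
// to the last '}' skips that wrapping before parsing.
function extractJson<T>(text: string): T | null {
  const start = text.indexOf('{')
  const end = text.lastIndexOf('}')
  if (start === -1 || end <= start) return null
  try {
    return JSON.parse(text.slice(start, end + 1)) as T
  } catch {
    return null // reply was not valid JSON
  }
}
```

You would still want to validate the parsed value against your zod schema (e.g. `inquirySchema.safeParse(...)` in Morphic's case) before trusting its shape.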
Any updates on this? It looks like the Gemini 1.5 preview supports object output now? @lgrammel
This is also a question I had, @jsonkcli. However, when I tried a snippet using `streamObject` from the Vercel AI SDK to check whether `gemini-1.5-pro-latest` is capable of object generation, it threw `unhandledRejection: Error: Model does not have a default object generation mode`. I assume `gemini-1.5-pro-latest` does not support object generation yet?
I got the same error even though I'm using OpenAI.
I would make sure you're on the most recent version of the Vercel AI SDK and importing OpenAI correctly, per the documentation for experimental generation. In addition, check that you're using a compatible model like gpt-3.5-turbo or gpt-4-turbo, and not a deprecated model like text-davinci.
Description
Hello! Instead of using OpenAI keys in the Morphic project (a template on Vercel), I wanted to experiment with Gemini, Google's LLM: a free model with comparable quality. I am using the AI SDK and initialized the Google provider correctly. According to the Vercel AI SDK docs, Google is a valid provider and can "create...language model objects that can be used with the generateText, streamText, generateObject, and streamObject AI functions". Despite this, upon integration I get an error saying the model does not have a default object generation mode.

I am fairly sure I used Gemini and the Vercel AI SDK correctly, so what is the issue? I created an issue with Morphic, and the owner, @miurla, said it would be better to file it against the SDK, since `generateObject`/`streamObject` don't work while `streamText` does. Under further debugging, I have confirmed that this is the case. Can you add support for `streamObject` with Gemini, or point out the issue? Below is my sample code from `app/lib/agents/inquire.tsx`. Thank you so much for your time and support! I really appreciate it!!! Have a wonderful day!!
Code example
```tsx
import { Copilot } from '@/components/copilot'
import { createStreamableUI, createStreamableValue } from 'ai/rsc'
import { ExperimentalMessage, experimental_streamObject } from 'ai'
import { PartialInquiry, inquirySchema } from '@/lib/schema/inquiry'
// import { google } from 'ai/google'
import { Google } from '@ai-sdk/google'

const google = new Google({
  baseUrl: '',
  apiKey: process.env.GOOGLE_GENERATIVE_AI_API_KEY
})

export async function inquire(
  uiStream: ReturnType<typeof createStreamableUI>,
  messages: ExperimentalMessage[]
) {
  const objectStream = createStreamableValue<PartialInquiry>()
  uiStream.update(<Copilot inquiry={objectStream.value} />)

  let finalInquiry: PartialInquiry = {}
  await experimental_streamObject({
    // model: openai.chat('gpt-4-turbo-preview'),
    model: google.generativeAI('models/gemini-pro'),
    // rest of code not shown
```
From the code above, this error is shown in my console when the site runs in development in VS Code:
```
Error: Model does not have a default object generation mode.
    at experimental_generateObject (webpack-internal:///(rsc)/./node_modules/ai/dist/index.mjs:614:13)
    at taskManager (webpack-internal:///(rsc)/./lib/agents/task-manager.tsx:27:89)
    at processEvents (webpack-internal:///(rsc)/./app/action.tsx:55:91)
    at $$ACTION_0 (webpack-internal:///(rsc)/./app/action.tsx:104:5)
    at endpoint (webpack-internal:///(rsc)/./node_modules/next/dist/build/webpack/loaders/next-flight-action-entry-loader.js?actions=%5B%5B%22%2FUsers%2FKarthik%2FDownloads%2Fmorphic%2Fapp%2Faction.tsx%22%2C%5B%22%24%24ACTION_0%22%5D%5D%2C%5B%22%2FUsers%2FKarthik%2FDownloads%2Fmorphic%2Fnode_modules%2Fai%2Frsc%2Fdist%2Frsc-server.mjs%22%2C%5B%22%24%24ACTION_0%22%5D%5D%5D&client_imported__=!:9:17)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async eval (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1138:24)
    at async $$ACTION_0 (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1134:12)
    at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:316:31)
    at async handleAction (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:245:9)
⨯ unhandledRejection: Error: Model does not have a default object generation mode.
```
```
The streamable UI has been slow to update. This may be a bug or a performance issue or you forgot to call .done().
The streamable UI has been slow to update. This may be a bug or a performance issue or you forgot to call .done().
```
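The trailing "streamable UI has been slow to update" warnings usually mean the rejection escaped before `.done()` was called on the streamable. One defensive pattern is to close the stream in a `finally` block so the UI never hangs, even when the model call rejects. A minimal sketch in plain TypeScript (the `Closable` interface and `runWithStream` helper are my own stand-ins, not part of the AI SDK):

```typescript
// Closable: minimal stand-in for the RSC streamable API's close method.
interface Closable {
  done(): void
}

// runWithStream: run a model call and guarantee the stream is closed,
// even if the call rejects (e.g. "Model does not have a default object
// generation mode"). Returns undefined on failure instead of throwing.
async function runWithStream<T>(
  stream: Closable,
  fn: () => Promise<T>
): Promise<T | undefined> {
  try {
    return await fn()
  } catch (err) {
    console.error('stream failed:', err)
    return undefined
  } finally {
    stream.done() // always close, or the UI warns "slow to update"
  }
}
```

In Morphic's `inquire` this would wrap the `experimental_streamObject` call, with `objectStream` and `uiStream` both closed in the `finally` path.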
Additional context
I opened an issue with @miurla on Morphic, and they suggested that I open an issue with the Vercel AI SDK, since streamObject doesn't work with the Gemini model but streamText does.