Closed: Iven2132 closed 5 months ago
I encountered the same issue after migrating to 3.1.x. I'm not sure if it's just gotten more strict and is no longer tolerating existing bad practices in my code, if I migrated wrong, or if this is just a bug.
My migration was as simple as the straightforward swap from render => streamUI and unstable_onGet/SetState in AIProvider, and I tried both the direct { openai } import and createProvider from the ai-sdk for the new provider.
For me, the issue appears even when just returning a text response with nothing else provided (no tool calls, etc.).
Loading chats seems to work for text responses, but the initial send throws the error after textStream finishes, according to the server console.
To be clear, it's the same error as @Iven2132's: 'Only plain objects can be passed' @ stringify.
Like I'm sure Iven is, I'm happy to test out any suggestions. If I get time, I'll put together a minimal reproduction and hopefully shed additional light.
@Iven2132 @shaded-blue which Next.js versions are you using? I just tested the example on the latest Next.js and it worked for me.
Using Next.js 14.2.3, AI SDK 3.1.1, and @ai-sdk/openai 0.0.9.
@Iven2132 hm, I'm using the same. Can you share your layout.tsx content as well?
Yes:
import type { Metadata } from "next";
import { Inter } from "next/font/google";
import "./globals.css";
const inter = Inter({ subsets: ["latin"] });
export const metadata: Metadata = {
  title: "Create Next App",
  description: "Generated by create next app",
};

export default function RootLayout({
  children,
}: Readonly<{
  children: React.ReactNode;
}>) {
  return (
    <html lang="en">
      <body className={inter.className}>{children}</body>
    </html>
  );
}
Strange. Here is my working setup:
actions.tsx
'use server';
import { createStreamableValue } from 'ai/rsc';
import { CoreMessage, streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
export async function continueConversation(messages: CoreMessage[]) {
  const result = await streamText({
    model: openai('gpt-4-turbo'),
    messages,
  });

  const stream = createStreamableValue(result.textStream);
  return stream.value;
}
layout.tsx
import type { Metadata } from 'next';
import { Inter } from 'next/font/google';
import './globals.css';
const inter = Inter({ subsets: ['latin'] });
export const metadata: Metadata = {
  title: 'Create Next App',
  description: 'Generated by create next app',
};

export default function RootLayout({
  children,
}: Readonly<{
  children: React.ReactNode;
}>) {
  return (
    <html lang="en">
      <body className={inter.className}>{children}</body>
    </html>
  );
}
page.tsx
'use client';
import { type CoreMessage } from 'ai';
import { useState } from 'react';
import { continueConversation } from './actions';
import { readStreamableValue } from 'ai/rsc';
export default function Chat() {
  const [messages, setMessages] = useState<CoreMessage[]>([]);
  const [input, setInput] = useState('');
  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.map((m, i) => (
        <div key={i} className="whitespace-pre-wrap">
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.content as string}
        </div>
      ))}

      <form
        action={async () => {
          const newMessages: CoreMessage[] = [
            ...messages,
            { content: input, role: 'user' },
          ];

          setMessages(newMessages);
          setInput('');

          const result = await continueConversation(newMessages);

          for await (const content of readStreamableValue(result)) {
            setMessages([
              ...newMessages,
              {
                role: 'assistant',
                content: content as string,
              },
            ]);
          }
        }}
      >
        <input
          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={e => setInput(e.target.value)}
        />
      </form>
    </div>
  );
}
next: 14.2.3 react: 18.2.0 ai-sdk: 3.1.1
Can you try the above? It should be very similar to what you have.
Can you share your project, like a GitHub repo?
I've modified examples/next-ai-rsc locally with the changes above.
Here's the branch: https://github.com/vercel/ai/tree/lg/issue-1501/examples/next-ai-rsc
The error only comes when I use mistralai/Mixtral-8x7B-Instruct-v0.1 from together.ai; the code works fine with gpt-3.5.
Can you please try models from Together AI?
@lgrammel let me know if you can reproduce it.
Could be unrelated, but I was running into this while creating an Azure OpenAI provider. The first chunk of the stream was failing Zod validation, which triggered this code:
if (!chunk.success) {
  controller.enqueue({ type: "error", error: chunk.error });
  return;
}
Sending over the full error resulted in the 'Only plain objects can be passed to Client Components from Server Components' warning. I changed chunk.error to chunk.error.message and the warning was resolved.
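A minimal sketch of that change, assuming the same chunk-handling shape as the snippet above:

if (!chunk.success) {
  // Enqueue only the message string: a plain value survives
  // server-to-client serialization, while an Error instance does not.
  controller.enqueue({ type: "error", error: chunk.error.message });
  return;
}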
@patrick-moore interesting, this could be related.
@Iven2132 I don't have together.ai access. Can you reproduce with another provider such as Fireworks or Groq?
So I was able to reproduce this with the Together API, and there are a couple of things that happen to cause the failure.
The Together API uses 'eos' as the finish_reason to mark the end of a text generation. However, the OpenAI provider that you're proxying the endpoint with only supports 'stop' to mark the end of the stream, so there is a schema mismatch when it encounters 'eos', and it throws an error.
Now this error is returned as an Error object as part of the stream ({ error: Error }), which is not a supported type that can be passed from a server action to a client component, hence why you're seeing the 'Only plain objects can be passed to Client Components from Server Components' message.
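To illustrate the constraint: React can only serialize plain, JSON-like values across the server/client boundary, so a class instance like Error is rejected while a plain object with the same fields passes. A minimal sketch with hypothetical values:

// A class instance triggers the warning when sent to the client:
const bad = { error: new Error('schema mismatch') };

// A plain object carrying the same information serializes fine:
const ok = { error: { message: 'schema mismatch' } };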
There are two ways to fix this:

1. Create a custom provider for the Together API that supports their unique implementation details, so 'eos' is considered a valid stop sequence (a rough sketch follows the try...catch example below).
2. Add a try...catch block when you're reading from the stream and handle the error quietly:
try {
  const result = await continueConversation(newMessages);

  for await (const content of readStreamableValue(result)) {
    setMessages([
      ...newMessages,
      {
        role: 'assistant',
        content: content as string,
      },
    ]);
  }
} catch (error) {
  console.error(error);
}
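For the first option, here is a rough sketch of one possible workaround rather than a full custom provider: point the OpenAI provider at Together's OpenAI-compatible endpoint and rewrite the non-standard finish_reason before the SDK validates each chunk. It assumes your @ai-sdk/openai version accepts a custom fetch setting; TOGETHER_API_KEY and the baseURL are assumptions to verify against Together's docs:

import { createOpenAI } from '@ai-sdk/openai';

const together = createOpenAI({
  baseURL: 'https://api.together.xyz/v1',
  apiKey: process.env.TOGETHER_API_KEY,
  // Rewrite Together's 'eos' finish_reason to the 'stop' value the
  // OpenAI chunk schema expects, before the SDK parses each chunk.
  fetch: async (url, init) => {
    const response = await fetch(url, init);
    if (!response.body) return response;
    const rewritten = response.body
      .pipeThrough(new TextDecoderStream())
      .pipeThrough(
        new TransformStream<string, string>({
          transform(text, controller) {
            // Simplification: assumes the pattern is not split across chunks.
            controller.enqueue(
              text.replaceAll('"finish_reason":"eos"', '"finish_reason":"stop"'),
            );
          },
        }),
      )
      .pipeThrough(new TextEncoderStream());
    return new Response(rewritten, {
      status: response.status,
      headers: response.headers,
    });
  },
});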
I'm relaxing the Zod checking here: https://github.com/vercel/ai/pull/1835/files#diff-acdaca42c343ab5155c2f7f101611d05fff503a09e2c4ddf3992beffafa09f39, so it might work with upcoming OpenAI provider versions.
Description
The server-action example in the AI SDK docs gives me an error: Warning: Only plain objects can be passed to Client Components from Server Components.
Code example
Additional context
Using the latest version of the AI SDK, 3.1.1.