vercel / ai-chatbot

A full-featured, hackable Next.js AI chatbot built by Vercel
https://chat.vercel.ai
6.24k stars 1.96k forks

Selecting any of the trending memecoins throws a `Cannot destructure property 'role'` error #325

Closed athrael-soju closed 3 months ago

athrael-soju commented 5 months ago

@jeremyphilemon although https://github.com/vercel/ai-chatbot/pull/324 resolves the majority of issues from using streamUI over render, this one still remains.

Simply get the trending memecoins to show and select one of them. Error attached.


prashantbhudwal commented 5 months ago

@jeremyphilemon @jaredpalmer

The messages sent to OpenAI are not formatted properly:

Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'.

I was testing; this works:

const toolCallMessage: CoreMessage = {
  role: "assistant",
  content: [
    {
      type: "tool-call",
      toolName: "showStockPurchase",
      toolCallId: "8Bb6oJ1vAIRuHSIAVYmAp",
      // args holds the actual call arguments, not a zod schema
      args: {
        symbol: "AAPL",
        price: 150,
        defaultAmount: 100,
      },
    },
  ],
};

const toolMessage: CoreMessage = {
  role: "tool",
  content: [
    {
      result: JSON.stringify({
        symbol: "AAPL",
        price: 150,
        defaultAmount: 100,
        status: "completed",
      }),
      type: "tool-result",
      toolCallId: "8Bb6oJ1vAIRuHSIAVYmAp",
      toolName: "showStockPurchase",
    },
  ],
};

const all = [...messages, toolCallMessage, toolMessage];
athrael-soju commented 5 months ago

@prashantbhudwal Could you raise a PR please?

prashantbhudwal commented 5 months ago

@jeremyphilemon @jaredpalmer

> The messages sent to OpenAI are not formatted properly. […]
@athrael-soju This is not a solution, just a problem description. I am still learning this. So, I don't think I am the best person for the PR.

hemik000 commented 5 months ago

Same error. Any solution?

Spectralgo commented 5 months ago

Same kind of error here, with this error message: Cannot destructure property 'role' of '.for' as it is undefined at convertToOpenAIChatMessage

Can't get the app to reply after it shows me a card.

Time for me to study some docs 📃 and try some implementations! GLHF

UPDATE:

I checked out the previous commit and the bug disappears.

hemik000 commented 5 months ago

Because the new version uses streamUI: https://github.com/vercel/ai-chatbot/blob/095550d4dca22dc506cdbda815cab94cfe8fbe74/lib/chat/actions.tsx#L140

And the old one uses render: https://github.com/vercel/ai-chatbot/blob/d5f736128dca6efff963fa3705f728b06f8d7927/lib/chat/actions.tsx#L144

athrael-soju commented 5 months ago

> Because the new SDK uses streamUI
>
> https://github.com/vercel/ai-chatbot/blob/095550d4dca22dc506cdbda815cab94cfe8fbe74/lib/chat/actions.tsx#L140
>
> And old one uses a render
>
> https://github.com/vercel/ai-chatbot/blob/d5f736128dca6efff963fa3705f728b06f8d7927/lib/chat/actions.tsx#L144

The bug appears when using streamUI, not render. If you revert to the previous commit it works fine with render.

DanielhCarranza commented 5 months ago

I have the same error after clicking on the first generated component. I updated everything, even Next.js, but it didn't work.

fullstackwebdev commented 5 months ago

Yes, this is an issue with the new code. I guess it should be reverted or fixed? I tried a few things to resolve it, but wasn't able to.

JoseAngelChepo commented 5 months ago

The latest update ("streamUI instead of render" #324) no longer allows messages with "role: system" in the middle of the conversation (only at the start, via the new `system` prompt param).

The problem is: the functions also use messages with "role: system" to record user actions or events in the conversation context.

The Cannot destructure property 'role' error is a consequence of the function below, which returns undefined for "role: system" messages.

This function is inside the "ai": "^3.1.1" dependency:

// core/prompt/convert-to-language-model-prompt.ts
function convertToLanguageModelPrompt(prompt) {
  const languageModelMessages = [];
  if (prompt.system != null) {
    languageModelMessages.push({ role: "system", content: prompt.system });
  }
  switch (prompt.type) {
    case "prompt": {
      languageModelMessages.push({
        role: "user",
        content: [{ type: "text", text: prompt.prompt }]
      });
      break;
    }
    case "messages": {
      languageModelMessages.push(
        ...prompt.messages.map((message) => {
          switch (message.role) {
            case "user": {
              if (typeof message.content === "string") {
                return {
                  role: "user",
                  content: [{ type: "text", text: message.content }]
                };
              }
              return {
                role: "user",
                content: message.content.map(
                  (part) => {
                    var _a;
                    switch (part.type) {
                      case "text": {
                        return part;
                      }
                      case "image": {
                        if (part.image instanceof URL) {
                          return {
                            type: "image",
                            image: part.image,
                            mimeType: part.mimeType
                          };
                        }
                        const imageUint8 = convertDataContentToUint8Array(
                          part.image
                        );
                        return {
                          type: "image",
                          image: imageUint8,
                          mimeType: (_a = part.mimeType) != null ? _a : detectImageMimeType(imageUint8)
                        };
                      }
                    }
                  }
                )
              };
            }
            case "assistant": {
              if (typeof message.content === "string") {
                return {
                  role: "assistant",
                  content: [{ type: "text", text: message.content }]
                };
              }
              return { role: "assistant", content: message.content };
            }
            case "tool": {
              return message;
            }
          }
        })
      );
      break;
    }
    default: {
      const _exhaustiveCheck = prompt;
      throw new Error(`Unsupported prompt type: ${_exhaustiveCheck}`);
    }
  }
  return languageModelMessages;
}

I tested this in the compiled module node_modules/ai/rsc/dist/rsc-server.mjs by adding to convertToLanguageModelPrompt:

case "system": {
    return message;
}

Result:


This works, but the correction must be made in the repository of the "ai": "^3.1.1" dependency.

And that depends on whether it is considered good practice to send system messages mid-conversation to save new context (user events).

kevb10 commented 4 months ago

The above didn't solve my problem. On my end there is a phantom object that is undefined, and that's what causes the error.


Obviously a little sanity check addresses this, but I'm curious where and why an operation returns undefined in the first place, so it can be fixed there instead.

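For anyone hitting the same crash, the "sanity check" idea above can be sketched as a small filter that drops undefined or role-less entries before the SDK destructures `role`. This is a hypothetical helper (the names and types are illustrative, not part of the AI SDK):

```typescript
type ChatMessage = { role: string; content: unknown };

// Drop undefined or malformed entries so downstream code can safely
// destructure `role` from every element of the history.
function sanitizeMessages(
  messages: Array<ChatMessage | undefined | null>
): ChatMessage[] {
  return messages.filter(
    (m): m is ChatMessage => m != null && typeof m.role === "string"
  );
}
```

You would apply this to the chat history right before it is handed to the model call; it papers over the symptom rather than fixing the root cause.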
JoseAngelChepo commented 4 months ago

@kevb10 you can track your messages in the streamUI function located in node_modules/ai/rsc/dist/rsc-server.mjs


Check the messages before and after.


And you can test with the function convertToLanguageModelPrompt(validatedPrompt) to check which message returns undefined.

P.S. Remember that you are editing the dependency itself.

Spectralgo commented 4 months ago

Thanks @kevb10. It didn't solve my problem either, but it's nice to see progress on this issue

Update: I implemented your sanity check and it works for me.

rbenhase commented 4 months ago

@kevb10's sanity check also works for me, using patch-package to create a patch file instead of making edits inside node_modules (this way, I won't lose my changes as soon as I run npm update or npm install). Still not an ideal solution, obviously, but good enough for the time being.

Just ran npm install patch-package postinstall-postinstall --save-dev

Followed by: npx patch-package @ai-sdk/openai

And the issue went away. Still interested in a more permanent solution, though.
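For reference, the usual patch-package workflow looks like this (assuming the edit was made under node_modules/ai; substitute whichever package you actually patched):

```shell
# Install patch-package and keep patches applied after installs
npm install patch-package postinstall-postinstall --save-dev

# Edit the file inside node_modules, then record the change as a patch file
npx patch-package ai

# Add a postinstall hook to package.json so the patch is re-applied
# on every install:
#   "scripts": { "postinstall": "patch-package" }
```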

prashantbhudwal commented 4 months ago

Can anyone please explain why https://chat.vercel.ai/ works perfectly fine, but the main branch on localhost does not?

Isn't the main branch from the repo deployed to this domain?

hemik000 commented 4 months ago

Maybe because that deployment points to an older commit.

ar-radcliff commented 4 months ago

Just noting that I'm also running into this bug and have been troubleshooting it a bit today. This affects any component that makes use of function or system role messages and happens after an unsupported message role type has been inserted into the chat history (which creates an undefined chat message).

What I've found mirrors what other commenters have noted: adjusting the convertToLanguageModelPrompt() function in node_modules/ai/rsc/dist/rsc-server.mjs to handle more of the message types defined in actions.tsx (specifically 'function' and 'system') seems to eliminate the error. It's not a good solution, though, since it's a workaround inside the dependency, and I'm not clear what other effects that change might have.

Something in the RSC framework chokes on message types of 'function', 'tool', 'data', and in some cases 'system', causing them to become undefined messages in the chat history array, which creates downstream problems elsewhere in the app.


I'm also not clear on whether those roles are intended to be functional in the RSC or with GPT-4 and just aren't working here yet, or if the app is trying to do something it shouldn't be and using the roles incorrectly. The message role functionality is still very new to me!

hemik000 commented 4 months ago

Replacing all function roles with assistant works. And the AI SDK as of ai@3.1.5 now supports the system message role: https://github.com/vercel/ai/blob/main/packages/core/core/prompt/message.ts#L12
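A minimal sketch of that role rewrite, assuming a simple { role, content } message shape (a stopgap workaround, not the SDK's official migration path):

```typescript
interface Message {
  role: string;
  content: string;
}

// Rewrite legacy 'function' roles as 'assistant' so the prompt
// converter recognizes every message in the history.
function normalizeRoles(messages: Message[]): Message[] {
  return messages.map((m) =>
    m.role === "function" ? { ...m, role: "assistant" } : m
  );
}
```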

athrael-soju commented 4 months ago

Honestly, with all the bugs combined, you're better off using the vercel-ai-src example

jeremyphilemon commented 4 months ago

Thanks everyone for trying to debug the issue, and I appreciate everyone's patience! As you all suspected, the messages property didn't follow the spec, so the migration from render to streamUI wasn't trivial and caused errors.

#337 follows the messages spec and should fix the error! I will also have an upgrade guide up in the docs soon to provide more clarity about this change and prevent any future confusion.