vercel / ai

Build AI-powered applications with React, Svelte, Vue, and Solid
https://sdk.vercel.ai/docs

Why is it generating a wrong answer, and how do I set the correct input? #855

Closed: xts-bit closed 11 months ago

xts-bit commented 11 months ago

Description

I'm using the AI SDK to call the Together AI API through the OpenAI SDK, which supports this. I tried it in Node.js and it works. However, when I do the same in Next.js using the Vercel AI SDK, it doesn't work as expected: if I enter "What is Vercel" in the input on the frontend, it replies "I'm a helpful assistant". How can I do this correctly?

Code example

import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

const openai = new OpenAI({
    apiKey: process.env.OPENAI_API_KEY,
    baseURL: "https://api.together.xyz/v1"
});

export const runtime = 'edge';

export async function POST(req: Request) {
    const { messages, input } = await req.json();
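    // Note: by default useChat posts only { messages } in the request body (the typed
    // input becomes the last user message), so `input` here will likely be undefined.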
    console.log(messages);
    try {
        const response = await openai.chat.completions.create({
            messages: [
                { role: "system", content: "You are an AI assistant" },
                { role: "user", content: input },
            ],
            model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
            max_tokens: 12,
            stream: true
        });

        const stream = OpenAIStream(response);
        return new StreamingTextResponse(stream);
    } catch (error) {
        console.log(error);
    }
}

'use client';

import { useChat } from 'ai/react';

export default function Chat() {

  const { messages, input, handleInputChange, handleSubmit } = useChat();
  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.map(m => (
        <div key={m.id} className="whitespace-pre-wrap">
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.content}
        </div>
      ))}

      <form onSubmit={handleSubmit}>
        <input
          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={handleInputChange}
        />
      </form>
    </div>
  );
}

Additional context

How can I get a correct response that makes sense given the input?

xts-bit commented 11 months ago

@jaredpalmer @MaxLeiter Any idea about this issue?

MaxLeiter commented 11 months ago

This isn't AI SDK related; you may want to experiment with a more complete system prompt for Mixtral, or use a model that has been trained for chat.
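For illustration, a minimal sketch of what a more complete system prompt could look like in the route handler. The prompt wording, the SYSTEM_PROMPT name, and the max_tokens value are assumptions for the example, not recommendations from this thread:

const SYSTEM_PROMPT = {
    role: "system" as const,
    content:
        "You are a helpful assistant. Answer the user's question directly and concisely. " +
        "If you do not know the answer, say so instead of guessing.",
};

const response = await openai.chat.completions.create({
    // System message first, then the conversation history from the client.
    messages: [SYSTEM_PROMPT, ...messages],
    model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
    max_tokens: 512, // 12 tokens is usually too few for a complete answer
    stream: true
});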

xts-bit commented 11 months ago

@MaxLeiter But can you please tell me how to provide a system prompt? I'm currently doing it like this:

export async function POST(req: Request) {
    const { messages } = await req.json();
    console.log(messages);
    try {
        const response = await openai.chat.completions.create({
            messages: messages,
            model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
            max_tokens: 12,
            stream: true
        });

        const stream = OpenAIStream(response);
        return new StreamingTextResponse(stream);
    } catch (error) {
        console.log(error);
    }
}

xts-bit commented 11 months ago

@lgrammel How do I set a system prompt for the AI?

Currently I am doing something like [...messages, SYSTEM_PROMPT] to make it work, but unfortunately it's not working as expected.

Can you please help me out?

lgrammel commented 11 months ago

@xts-bit Check out the OpenAI documentation for their SDK: https://platform.openai.com/docs/guides/text-generation/chat-completions-api
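That guide shows the system message as the first element of the messages array, followed by the user and assistant turns. A minimal sketch of a helper that applies this shape, assuming the openai v4 type names; the function name and prompt text are illustrative:

import type OpenAI from 'openai';

// Put the system prompt first, then the conversation history received from useChat.
function withSystemPrompt(
    history: OpenAI.Chat.ChatCompletionMessageParam[],
): OpenAI.Chat.ChatCompletionMessageParam[] {
    return [
        { role: 'system', content: 'You are a helpful assistant.' },
        ...history,
    ];
}

The handler can then pass messages: withSystemPrompt(messages) to openai.chat.completions.create.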

xts-bit commented 11 months ago

@lgrammel It doesn't work like that. I did messages: [ { "role": "system", "content": "You are a helpful assistant." }, ...messages] and it throws an error at index.mjs:288:

   POST http://localhost:3000/api/chat 404 (Not Found)
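The 404 suggests the POST never reached a matching route handler at all, which is a separate problem from how the system message is ordered. useChat posts to /api/chat by default, so in the App Router the handler needs to live at app/api/chat/route.ts, or the client has to be pointed at the actual path. A minimal sketch of the latter; the '/api/together-chat' path is purely illustrative:

'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  // Point useChat at the route that actually exists; the default is '/api/chat'.
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/together-chat',
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map(m => (
        <div key={m.id}>{m.role}: {m.content}</div>
      ))}
      <input value={input} onChange={handleInputChange} placeholder="Say something..." />
    </form>
  );
}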