Nutlope / twitterbio

Generate your Twitter bio with Mixtral and GPT-3.5.
https://www.twitterbio.io
MIT License

Generated Twitter bio uses the previously pasted bio, not the current input #52

Closed digimark2023 closed 9 months ago

digimark2023 commented 1 year ago

Hi @Nutlope ,

Initially, when I entered the text "Mechanical engineer with 7 years of experience..." and chose the "Professional" option, the Twitter bio displayed was consistently that of a digital marketer.

Subsequently, when I input the text "Historian with 10 years of experience..." and selected the "Professional" option, the Twitter bio shown was that of the mechanical engineer with 7 years of experience, i.e. the bio I had provided earlier.

In short: on the first attempt, the generated Twitter bio is unrelated to my input (always digital marketing), and from the second attempt onwards it is generated from the previous input rather than the current one.

I would greatly appreciate any assistance in resolving this issue. Thank you in advance!

TornatoreTristan commented 11 months ago

I had the same issue. Instead of getting the bio from the body object, I retrieve the content from the last user message.

If you add

const body = await req.json();
console.log(body);

to the route handler, you can see that the current input does come through as the last message. To get the last user message, I use:

const bio = body.messages[body.messages.length - 1].content;

This fixes the issue while we look for a cleaner solution. Good luck!
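As a self-contained illustration of this workaround (the payload shape below mirrors what console.log(body) prints for a useChat-style request; the helper name lastUserBio is just for this sketch):

```typescript
// Sketch of the workaround: the request body carries the whole chat history
// in `messages`, so the freshest bio is the content of the last entry.
type ChatMessage = { role: string; content: string };

function lastUserBio(body: { messages: ChatMessage[] }): string {
  // The last element of `messages` is the message the user just submitted.
  return body.messages[body.messages.length - 1].content;
}

// Example payload with two turns; only the newest bio should be used.
const body = {
  messages: [
    { role: "user", content: "Mechanical engineer with 7 years of experience" },
    { role: "user", content: "Historian with 10 years of experience" },
  ],
};

console.log(lastUserBio(body)); // → "Historian with 10 years of experience"
```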

wcgordon1 commented 10 months ago

I can't figure this out either. I'm still getting the OpenAI response for the previous input.

I also noticed on twitterbio.io that the text in the bio input remains after you click Generate, and scroll-to-bottom doesn't work either.

wcgordon1 commented 10 months ago

(quoting @TornatoreTristan's workaround above)

Where is this in the code? They may have updated to the Vercel AI SDK since your post. Thanks!

digimark2023 commented 9 months ago

(quoting the exchange above)

Hi @TornatoreTristan, thanks for the reply, but I'm not able to find the code that needs to change. Could you advise the file name and the existing code? Thanks in advance!

gltjk commented 9 months ago

(quoting the exchange above)

Here is the code. Instead of using const { vibe, bio } = await req.json();, get the messages out with const { vibe, messages } = await req.json();. The correct bio is then the content of the last element of messages.

Here is the full POST function:

export async function POST(req: Request) {
  const { vibe, messages } = await req.json();
  const bio = messages.at(-1).content;

  // Ask OpenAI for a streaming completion given the prompt
  const response = await openai.createChatCompletion({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages: [
      {
        role: 'user',
        content: `Generate 2 ${vibe} twitter biographies with no hashtags and clearly labeled "1." and "2.". ${
          vibe === 'Funny'
            ? "Make sure there is a joke in there and it's a little ridiculous."
            : '' // empty string, so "null" isn't interpolated into the prompt
        }
          Make sure each generated biography is less than 160 characters, has short sentences that are found in Twitter bios, and base them on this context: ${bio}${
          bio.slice(-1) === '.' ? '' : '.'
        }`,
      },
    ],
  });

  // Convert the response into a friendly text-stream
  const stream = OpenAIStream(response);
  // Respond with the stream
  return new StreamingTextResponse(stream);
}
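For anyone skimming, the two small expressions doing the work in the handler above can be checked in isolation (the sample data here is illustrative): messages.at(-1) picks the newest message, and the trailing ternary in the prompt appends a period only when the bio doesn't already end with one.

```typescript
// Sample history as the client would send it; only the newest entry matters.
const messages = [
  { role: "user", content: "Mechanical engineer with 7 years of experience" },
  { role: "user", content: "Historian with 10 years of experience" },
];

// Newest message's content becomes the bio context for the prompt.
const bio = messages.at(-1)!.content;

// Guarantee a trailing period without doubling an existing one.
const withPeriod = `${bio}${bio.slice(-1) === "." ? "" : "."}`;

console.log(bio);        // the newest bio, not the first one
console.log(withPeriod); // same text with a guaranteed trailing period
```

Note that Array.prototype.at requires Node 16.6+ (or an ES2022 lib target); on older runtimes, use messages[messages.length - 1] instead.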
digimark2023 commented 9 months ago

(quoting @gltjk's answer above)

@gltjk Thank you so much. It's working as expected. :)