Nutlope / twitterbio

Generate your Twitter bio with Mixtral and GPT-3.5.
https://www.twitterbio.io
MIT License

Sending an additional param to const payload: OpenAIStreamPayload #41

Closed: kick closed this issue 1 year ago

kick commented 1 year ago

Issue fixed by updating useState("300") to useState(300), so the max tokens value is sent as a number rather than a string.
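A minimal sketch of that change, keeping the state names from the component snippet further down:

// Store the token limit as a number so JSON.stringify sends a number,
// not a string, to the API route.
const [smaxtoken, setMmaxtoken] = useState(300);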

Aside from the prompt, I want to send a "max tokens" value over to the handler, but it's only picking up the prompt and ignoring everything else.

generate.ts

const handler = async (req: Request): Promise<Response> => {
  const { prompt, smaxtoken } = (await req.json()) as {
    prompt?: string;
    smaxtoken?: number; // <----- NOT BEING RECOGNIZED
  };

  if (!prompt || !smaxtoken) {
    return new Response("No prompt in the request", { status: 400 });
  }

  const payload: OpenAIStreamPayload = {
    model: "gpt-4",
    messages: [{ role: "user", content: prompt }],
    temperature: 0.7,
    top_p: 1,
    frequency_penalty: 0,
    presence_penalty: 0,
    max_tokens: smaxtoken, // <----- NOT BEING RECOGNIZED
    stream: true,
    n: 1,
  };
  // ...the rest of the handler (streaming the completion back) is unchanged
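One defensive option, not taken from the repo itself, is to coerce the incoming value on the server so the handler works whether the client sends 320 or "320". A minimal sketch; the toMaxTokens name and the 300 fallback are hypothetical:

// Hypothetical helper: turn whatever the client sent into a safe positive
// integer for max_tokens, falling back to an arbitrary default of 300.
const toMaxTokens = (value: unknown, fallback = 300): number => {
  const parsed = Number(value);
  return Number.isFinite(parsed) && parsed > 0 ? Math.floor(parsed) : fallback;
};

// Inside the handler this would be used as: max_tokens: toMaxTokens(smaxtoken)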

component.tsx

const [smaxtoken, setMmaxtoken] = useState("320"); // string state: this is what the fix at the top changes to a number

const generateBio = async (e: any) => {
  e.preventDefault();
  setGeneratedBios("");
  setLoading(true);
  const response = await fetch("/api/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      prompt,
      smaxtoken, // <--------- VALUE BEING SENT OVER
    }),
  });
  // ...reading the streamed response is omitted here

Value is being sent over successfully (screenshot omitted).
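For completeness, the part of generateBio omitted above would typically read the streamed response; a minimal sketch using the standard ReadableStream API, assuming setGeneratedBios and setLoading are the state setters already in the component:

// Assumed continuation (not shown in the original issue): decode the
// streamed chunks and append them to the generated bio state.
const data = response.body;
if (!data) return;

const reader = data.getReader();
const decoder = new TextDecoder();
let done = false;

while (!done) {
  const { value, done: doneReading } = await reader.read();
  done = doneReading;
  setGeneratedBios((prev: string) => prev + decoder.decode(value));
}
setLoading(false);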