Closed saliksik closed 1 year ago
In case it's helpful: I have also used Nutlope's Twitter bio template, and you can check out how I implemented the tokenizer at https://github.com/llegomark/betterreadings.
To implement the tokenizer, I used the following code (note that the class is imported from the gpt3-tokenizer package, and that encode() returns an object whose bpe array holds the token ids):

import GPT3Tokenizer from 'gpt3-tokenizer';

const tokenizer = new GPT3Tokenizer({ type: 'gpt3' });
const tokens = tokenizer.encode(prompt);
const numTokens = tokens.bpe.length;

const MAX_PROMPT_TOKENS = 400;
if (numTokens > MAX_PROMPT_TOKENS) {
  return new Response(`The prompt has ${numTokens} tokens, which exceeds the maximum limit of ${MAX_PROMPT_TOKENS} tokens.`, {
    status: 400,
    statusText: "Bad Request",
  });
}
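If you just want a quick guard before calling the API and don't want to pull in a tokenizer dependency, a rough pre-check is also possible. This is only a sketch based on OpenAI's rule of thumb of roughly 4 characters per token for English text; it is an approximation, not an exact BPE count, and the function names here are hypothetical:

// Rough token estimate using OpenAI's ~4 characters per token rule of thumb.
// This is an approximation, not an exact BPE count like gpt3-tokenizer gives.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

const MAX_PROMPT_TOKENS = 400;

// Returns true if the prompt is likely within the token budget.
function promptWithinLimit(prompt) {
  return estimateTokens(prompt) <= MAX_PROMPT_TOKENS;
}

For anything where the exact count matters (billing, hard model limits), use a real tokenizer instead.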
I hope this information is helpful to you. Let me know if you have any further questions. Thank you!
Hello,
I've been trying to add the GPT3-Tokenizer, but I haven't been able to get it to work properly. At present, I am using the template provided by Nutlope for my Twitter bio, which can be found at .
I am also still in the process of learning React and Next.js.
Thank you!