CNSeniorious000 / free-chat

An elegant LLM chat UI forked from @anse-app's chatgpt-demo. Index site: https://free-chat.asia
https://endless-chat.vercel.app
MIT License

Implement token counting on server side (✓ Sandbox Passed) #20

Open · sweep-ai[bot] opened 12 months ago

sweep-ai[bot] commented 12 months ago

PR Feedback: 👎

Description

This pull request implements token counting on the server side.

Summary

Fixes #17.


sweep-ai[bot] commented 12 months ago

Sandbox Executions

Ran GitHub Actions for 09d72442ebe25ea72693afd406fe601d703d1b27:
• Vercel Preview Comments:

Ran GitHub Actions for 30a5ea4d0bdc06e092563c96327c3e11eeb3cff2:
• Vercel Preview Comments:

vercel[bot] commented 12 months ago

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name | Status | Preview | Comments | Updated (UTC)
--- | --- | --- | --- | ---
44444444444 | ❌ Failed (Inspect) | | | Dec 23, 2023 2:11pm
emphasize | ❌ Failed (Inspect) | | | Dec 23, 2023 2:11pm
endless-chat | ❌ Failed (Inspect) | | | Dec 23, 2023 2:11pm
free-chat | ❌ Failed (Inspect) | | | Dec 23, 2023 2:11pm
free-chat-personal | ❌ Failed (Inspect) | | | Dec 23, 2023 2:11pm

sweep-ai[bot] commented 12 months ago

Apply Sweep Rules to your PR?

senior-dev-bot[bot] commented 12 months ago

Hi there! 👋 Thanks for opening a PR. 🎉 To get the most out of Senior Dev, please sign up in our Web App, connect your GitHub account, and add/join your organization CNSeniorious000. After that, you will receive code reviews beginning on your next opened PR. 🚀

netlify[bot] commented 12 months ago

Deploy Preview for endless-chat failed.

Name | Link
--- | ---
Latest commit | 44b8b8a284c1db023694b31530e985272d082859
Latest deploy log | https://app.netlify.com/sites/endless-chat/deploys/6580a67726c50f00089cd7fb

pr-explainer-bot[bot] commented 12 months ago

Pull Request Report

Hey there! I've created a report for the pull request based on the commit history. Let's dive in!

Changes

  1. Updated src/pages/api/generate.ts:
    • Added import for countTokensServer from @/utils/tiktoken-server (see the sketch after this list).
    • Added minMessages and maxTokens constants.
    • Implemented token counting and trimming logic for messages.
    • Updated initOptions with trimmed messages.
    • Added error handling for token counting and trimming logic.
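
The countTokensServer helper itself is not shown in this report. Purely as an illustration, a server-side counter could look roughly like the sketch below; the js-tiktoken dependency, the cl100k_base encoding, the ChatMessage import path, and the per-message overhead are assumptions for the example, not details taken from the diff.

```ts
// Hypothetical sketch of @/utils/tiktoken-server; not the actual file from this PR.
import { getEncoding } from 'js-tiktoken';
import type { ChatMessage } from '@/types'; // assumed location of the ChatMessage type

const encoding = getEncoding('cl100k_base');

// Counts tokens for an optional system prompt plus the chat history.
// The per-message overhead is a rough allowance for role/formatting tokens,
// so the result is an approximation rather than an exact billing count.
export const countTokensServer = (systemPrompt: string | null, messages: ChatMessage[]) => {
  const perMessageOverhead = 4;
  let total = systemPrompt ? encoding.encode(systemPrompt).length + perMessageOverhead : 0;
  for (const message of messages)
    total += encoding.encode(message.content).length + perMessageOverhead;
  return { total };
};
```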

Suggestions

```ts
// Suggested refactor from the report: extract the counting/trimming loop into a helper.
// The countTokensServer path comes from the changes above; the other imports are assumed.
import type { APIRoute } from 'astro';
import type { ChatMessage } from '@/types';
import { countTokensServer } from '@/utils/tiktoken-server';

// Add this function at the top of the file
const trimMessages = (messages: ChatMessage[], maxTokens: number, minMessages: number) => {
  const trimmedMessages = [...messages];
  let tokenCount = countTokensServer(null, trimmedMessages).total;
  // Drop the oldest messages until the history fits the token budget,
  // but always keep at least `minMessages` of recent context.
  while (tokenCount > maxTokens && trimmedMessages.length > minMessages) {
    trimmedMessages.shift();
    tokenCount = countTokensServer(null, trimmedMessages).total;
  }
  return trimmedMessages;
};

// Update the post function
export const post: APIRoute = async ({ request }) => {
  // ...
  const trimmedMessages = trimMessages(messages, maxTokens, minMessages);
  // ...
};
```
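
The report also says initOptions was updated to use the trimmed history. A rough illustration of what that step might look like follows; the buildInitOptions name, the limits, and the payload fields are made up for the example and are not taken from the diff.

```ts
// Illustrative wiring only: feed the trimmed history into the upstream request.
const buildInitOptions = (apiKey: string, messages: ChatMessage[]): RequestInit => {
  const maxTokens = 3000; // assumed prompt budget
  const minMessages = 2;  // assumed minimum of retained messages

  const trimmedMessages = trimMessages(messages, maxTokens, minMessages);

  return {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: 'gpt-3.5-turbo',
      messages: trimmedMessages,
      stream: true,
    }),
  };
};
```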

Bugs

Improvements

Rating

I would rate the code a 7 out of 10. The code is generally readable and performs well. However, there are a few areas that could be refactored for better readability, such as extracting the token counting and trimming logic into a separate function. Overall, the code seems to be secure and error handling is in place.

That's it for the pull request report! Let me know if you need any further assistance. Have a great day! 😄

netlify[bot] commented 12 months ago

Deploy Preview for chat-for-free failed.

Name | Link
--- | ---
Latest commit | 44b8b8a284c1db023694b31530e985272d082859
Latest deploy log | https://app.netlify.com/sites/chat-for-free/deploys/6580a677ca656500085be314