lowspace / chatgpt-token-counter

Token Counter Chrome Extension for ChatGPT: A handy tool designed to monitor and display the token usage of ChatGPT sessions directly in your browser, helping you manage and optimize interactions efficiently.
MIT License

How can CPU consumption be reduced? #2

Closed · lowspace closed this 3 months ago

lowspace commented 3 months ago

During new and long conversations, CPU consumption can rise to about 40%. How can this be optimized?

The naive plan here is to use storage to avoid redundant recomputation.
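
A minimal sketch of what such a cache could look like, assuming `chrome.storage.local` (the promise-based Manifest V3 API) and illustrative field names; the extension's actual storage layout may differ:

```ts
// Hypothetical cache entry persisted per message; field names are illustrative.
interface TokenCacheEntry {
  modelSlug: string;   // model slug the count was produced under
  tokenCount: number;  // cached result, reused on later renders
}

// Persist a count keyed by the message's data-message-id.
async function saveCount(messageId: string, entry: TokenCacheEntry): Promise<void> {
  await chrome.storage.local.set({ [messageId]: entry });
}

// Return the cached entry for a message, or undefined on a miss.
async function loadCount(messageId: string): Promise<TokenCacheEntry | undefined> {
  const result = await chrome.storage.local.get(messageId);
  return result[messageId] as TokenCacheEntry | undefined;
}
```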

lowspace commented 3 months ago

The workflow here is:

flowchart TD
    A[Start] --> B[Extract data-message-id]
    A --> C[Extract current model slug]
    B --> D[Check if data-message-id is in storage]
    C --> D
    D -- No --> E[Combine data-message-id with current model slug and run token counter]
    D -- Yes --> F[Check if the current model slug was used for token counting]
    F -- No --> E
    F -- Yes --> G[Read token counts from storage]

Note:

  1. The tokenizer depends on the model type: for example, gpt4 and gpt3.5 share the same tokenizer, while gpt4o uses a different one.
  2. In this way, the HTML content is read and parsed only once, which reduces CPU consumption (a rough sketch of the lookup flow follows below).
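
As an illustration of the lookup flow in the flowchart, assuming hypothetical `countTokens` and `tokenizerFamily` helpers standing in for the extension's real tokenizer call and model-to-tokenizer mapping:

```ts
declare function countTokens(text: string, modelSlug: string): number; // hypothetical tokenizer call
declare function tokenizerFamily(modelSlug: string): string;           // e.g. gpt4 and gpt3.5 map to the same family

async function tokensForMessage(el: HTMLElement, currentModelSlug: string): Promise<number> {
  const messageId = el.getAttribute("data-message-id");
  if (!messageId) {
    // No stable id to key on: count directly without caching.
    return countTokens(el.innerText, currentModelSlug);
  }

  // Storage hit with a compatible tokenizer: reuse the cached count.
  const stored = await chrome.storage.local.get(messageId);
  const cached = stored[messageId] as { modelSlug: string; tokenCount: number } | undefined;
  if (cached && tokenizerFamily(cached.modelSlug) === tokenizerFamily(currentModelSlug)) {
    return cached.tokenCount;
  }

  // Miss, or the tokenizer changed: parse the message text once, count, and cache the result.
  const tokenCount = countTokens(el.innerText, currentModelSlug);
  await chrome.storage.local.set({ [messageId]: { modelSlug: currentModelSlug, tokenCount } });
  return tokenCount;
}
```

This keeps the expensive step (reading the message text and running the tokenizer) to a single pass per message, which is the reduction the flowchart describes.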