Cainier / gpt-tokens

Calculate the token consumption and amount of openai gpt message
MIT License

there is a memory leak in this package #6

Closed assmdx closed 1 year ago

assmdx commented 1 year ago

I used deepClone to make a new msg object, but the memory leak still exists.

assmdx commented 1 year ago

https://pullanswer.com/questions/always-meet-error-from-atdqbd-tiktoken-and-cause-memory-leak

RuslanMikailov commented 1 year ago

+1

assmdx commented 1 year ago

Maybe encoding.free() should be invoked here, each time around this for loop:

for (const [key, value] of Object.entries(message)) {
    num_tokens += encoding.encode(value).length;
    if (key === 'name') {
        num_tokens += tokens_per_name;
    }
}
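For context, the leak comes from the WASM-backed tiktoken encoder, which must be released with encoding.free() once it is no longer needed; freeing it inside the loop would invalidate it for the next iteration. A minimal sketch of the pattern, with the real @dqbd/tiktoken encoding replaced by a hypothetical stub so the example is self-contained:

```javascript
// Stub standing in for a tiktoken encoding object (hypothetical, for
// illustration only). The real encoder comes from encoding_for_model()
// and holds WASM memory that JS garbage collection cannot reclaim.
function getEncodingStub() {
    let freed = false;
    return {
        encode(text) {
            if (freed) throw new Error('use after free');
            return Array.from(text); // fake: one "token" per character
        },
        free() { freed = true; },
    };
}

// Token counting in the style of the snippet above, releasing the
// encoder in a finally block so free() runs exactly once on every
// code path, after the whole loop has finished.
function countMessageTokens(message, tokens_per_name = 1) {
    const encoding = getEncodingStub();
    let num_tokens = 0;
    try {
        for (const [key, value] of Object.entries(message)) {
            num_tokens += encoding.encode(value).length;
            if (key === 'name') {
                num_tokens += tokens_per_name;
            }
        }
    } finally {
        encoding.free(); // one free() per encoder, not per iteration
    }
    return num_tokens;
}

const n = countMessageTokens({ role: 'user', content: 'hi', name: 'bob' });
console.log(n);
```

With the stub's character-per-token rule, the call above yields 4 + 2 + 3 tokens for the values plus 1 for the name key, i.e. 10; the point is only the try/finally placement of free().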
Cainier commented 1 year ago

OK, thanks for the bug report.

I fixed this bug in v1.0.7