SychAndrii / infusion

A command line tool designed to help you generate documentation for your source code using LLMs.

Feat: Chat completion token count #20

Closed aamfahim closed 1 month ago

aamfahim commented 2 months ago

Description

I think a missing feature that would be a great addition is a flag which tells the user how many tokens were sent in the prompt, and how many tokens were returned in the chat completion.

It can be something like -u & --token-usage
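
As a rough sketch only (the flag names follow the proposal above; the module layout and other argument names are assumptions, not the project's actual code), the flag could be registered with argparse like this:

```python
import argparse

# Hypothetical sketch of wiring up the proposed -u / --token-usage flag.
parser = argparse.ArgumentParser(prog="python -m src.app")
parser.add_argument("file", help="Source file to generate documentation for")
parser.add_argument(
    "-u", "--token-usage",
    action="store_true",
    help="Print prompt and completion token counts after the output",
)
args = parser.parse_args()
# args.token_usage is True when -u / --token-usage is passed.
```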

Usage

It can be used as such:

python -m src.app --token-usage example/input/DataProcessor.TS

or

python -m src.app -u example/input/DataProcessor.TS

And right after the output, the following would be added:

Prompt token count: NUMBER_OF_TOKENS
Completion token count: NUMBER_OF_TOKENS
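
If the tool calls an OpenAI-compatible chat completion endpoint, both counts are already returned in the response's usage field, so no extra tokenization pass is needed. A minimal sketch, assuming the official OpenAI Python client (the model name and prompt are placeholders, not infusion's actual implementation):

```python
from openai import OpenAI

# Hypothetical sketch: client, model, and prompt are placeholders.
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Generate documentation for this file: ..."}],
)

# Normal output first.
print(response.choices[0].message.content)

# With -u / --token-usage set, append the counts reported by the API.
print(f"Prompt token count: {response.usage.prompt_tokens}")
print(f"Completion token count: {response.usage.completion_tokens}")
```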