mufeedvh / code2prompt

A CLI tool to convert your codebase into a single LLM prompt with source tree, prompt templating, and token counting.
MIT License

add token count for Gemini models #18

Closed lightningRalf closed 3 months ago

lightningRalf commented 3 months ago

Hello, thanks for creating this! It looks super interesting. With Gemini offering a 1 million token context window, and possibly 2 million soon, it would be great to see the token count for these models. https://ai.google.dev/gemini-api/docs/get-started/tutorial?lang=web#count-tokens

Thanks in advance for considering! Greetings from Franconia (Germany)

mufeedvh commented 3 months ago

Counting tokens for Gemini models requires an API call; their tokenizer is not open-sourced or locally usable like OpenAI's tiktoken. Supporting this would require users to configure an API key and the like, adding more complexity.
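For reference, a minimal sketch of what such a call would involve, based on the REST `countTokens` endpoint in the docs linked above. The endpoint path and payload shape here are assumptions drawn from the public Gemini API docs, and the model name and key are placeholders; this only builds the request rather than sending it:

```python
import json

# Assumed endpoint shape for Gemini's countTokens REST call (per the public docs);
# the model name and API key below are placeholders.
GEMINI_COUNT_TOKENS_URL = (
    "https://generativelanguage.googleapis.com/v1beta/"
    "models/{model}:countTokens?key={api_key}"
)

def build_count_tokens_request(prompt: str, model: str, api_key: str):
    """Return (url, json_body) for a countTokens call; sending it is left to the caller."""
    url = GEMINI_COUNT_TOKENS_URL.format(model=model, api_key=api_key)
    body = {"contents": [{"parts": [{"text": prompt}]}]}
    return url, json.dumps(body)

# Example (no network call is made here):
url, body = build_count_tokens_request("fn main() {}", "gemini-pro", "YOUR_API_KEY")
# A successful POST of `body` to `url` would return a JSON object with a token count.
```

The point is that unlike tiktoken, which runs entirely locally, every count requires a round trip to Google's servers plus a configured API key.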

The token counter in code2prompt is intended as an indicator of how long the rendered prompt is, which should be enough to gauge whether it fits any LLM model/tokenizer. Closing this feature request for now; will reconsider for the next release candidates.