Describe the bug
Using the `--tokens` option prints the token count at the beginning of the output rather than at the end.

To Reproduce
Steps to reproduce the behavior:
Expected behavior
I believe the token count was once printed at the end, where it is more useful. When it appears at the beginning, it can never be seen with a large codebase: the usual workflow is to copy a large block of output, so the token count should come at the end.

As a user, I want to see the token count immediately, without having to scroll back to the top.
Version
0.6.13