When we count tokens we should serialize the text first to avoid this problem: the history limiter measures tokens_used=15813 against the 14976 limit, even though the pre-serialization count (13936) fit under its limit:
2024-05-10T21:03:15.798876Z INFO refact_lsp::scratchpads::chat_utils_rag:551: 3741 lines in 1 files => tokens 13936 < 13943 tokens limit => 1384 lines in 1 files
2024-05-10T21:03:15.799386Z INFO refact_lsp::scratchpads::chat_utils_rag:602: file "/Users/valaises/RustroverProjects/refact-lsp/src/scratchpads/chat_utils_rag.rs":1-3740
2024-05-10T21:03:15.800649Z INFO refact_lsp::scratchpads::chat_utils_rag:703: postprocess_at_results2 0.194s
2024-05-10T21:03:15.801713Z INFO refact_lsp::scratchpads::chat_utils_limit_history:15: limit_messages_history tokens_limit=14976 because context_size=16000 and max_new_tokens=1024
2024-05-10T21:03:15.923983Z INFO refact_lsp::scratchpads::chat_utils_limit_history:50: not allowed to drop "[{\"file_name\":\"/Users/valaises...", tokens_used=15813 < 14976
2024-05-10T21:03:15.924019Z INFO refact_lsp::scratchpads::chat_utils_limit_history:50: not allowed to drop "you are a code assistant", tokens_used=15813 < 14976
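The mismatch above can be sketched as follows. This is a minimal Python illustration, not the refact-lsp implementation (which is Rust and uses a real model tokenizer); `count_tokens` here is a placeholder chars-per-token heuristic, and the message content is a made-up stand-in. The point it demonstrates: counting tokens on the raw message content undercounts relative to the serialized form that is actually budgeted against the context limit, because serialization adds keys, quotes, and escape characters.

```python
import json

def count_tokens(text: str) -> int:
    # Placeholder tokenizer: roughly 4 characters per token. The real
    # project would call its model tokenizer here instead.
    return max(1, len(text) // 4)

# Hypothetical context-file message, similar in shape to what the logs show.
message = {
    "role": "context_file",
    "content": '[{"file_name":"src/main.rs","file_content":"fn main() {}"}]',
}

# Counting only the raw content (what the first pass effectively budgets).
raw_count = count_tokens(message["content"])

# Counting the serialized message (what is actually sent to the model):
# JSON keys, quoting, and escaping inflate the count.
serialized_count = count_tokens(json.dumps(message))

# The serialized form always costs more tokens than the raw content alone,
# which is how a message that "fit" earlier can blow past the limit later.
assert serialized_count > raw_count
```

Counting against the serialized form up front keeps both passes in agreement, so `limit_messages_history` never sees a message that already exceeds the budget yet is "not allowed to drop".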