Closed: peilongchencc closed this issue 1 month ago
This is intentional -- including the question-SQL pairs as user/assistant messages is in-context learning; it is how the LLM knows to follow the pattern and generate SQL correctly.
You can dial down the number of pairs that are used by setting the `n_results_sql` config:
https://github.com/vanna-ai/vanna/blob/a72b842d420cf1fa061e5f97d45ea08051651ebb/src/vanna/chromadb/chromadb_vector.py#L25
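The mechanism described above can be sketched as follows. This is a minimal illustration, not vanna's actual implementation -- the function and variable names here are hypothetical; only the `n_results_sql` config key comes from the linked source:

```python
# Hypothetical sketch (not vanna's code): each stored question-SQL pair
# becomes a user/assistant message pair, and n_results_sql caps how many
# pairs are included as in-context examples.
def build_messages(question, question_sql_list, n_results_sql=10):
    messages = [{"role": "system", "content": "You are a SQL generator."}]
    # Only the top-n retrieved pairs are added as few-shot examples.
    for pair in question_sql_list[:n_results_sql]:
        messages.append({"role": "user", "content": pair["question"]})
        messages.append({"role": "assistant", "content": pair["sql"]})
    # The actual question is the final user message.
    messages.append({"role": "user", "content": question})
    return messages

pairs = [
    {"question": "How many branches are in Beijing?",
     "sql": "SELECT COUNT(*) FROM branches WHERE city = 'Beijing';"},
    {"question": "List all branch names.",
     "sql": "SELECT name FROM branches;"},
]
msgs = build_messages("Are there any branches in Shanghai?", pairs,
                      n_results_sql=1)
# With n_results_sql=1: system + 1 example pair + final question = 4 messages.
```

With vanna itself, the limit would presumably be passed through the config dict when constructing the vector store (e.g. `config={"n_results_sql": 2}`); check the linked source for the exact usage.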
Ok, I thought you wanted to use `question_sql_list` in `initial_prompt`.
Describe the bug
vanna version: 0.5.5
I suspect there is an issue with the usage of `question_sql_list`. Every time a search is conducted, `question_sql_list` is added to the historical conversation, which unnecessarily consumes my GPT tokens. Could you please check this? For example, when I asked a few questions, the resulting historical conversation was as follows:
However, I am just calling
`vn_rtn = vn.ask(question="Are there any branches in Shanghai?", visualize=False)`.
This is a single-turn conversation, so why is it being constructed as a historical conversation?
To Reproduce
My complete code is as follows; I just change the `question` each time. You can run the code to reproduce the result:
Expected behavior
A single-turn conversation.
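Note that, per the maintainer's reply above, what appears in the log is not accumulated chat history: each `ask()` call builds a fresh prompt, and the question-SQL pairs are per-call few-shot examples. A minimal sketch of that distinction (hypothetical names, not vanna's implementation):

```python
# Hypothetical sketch: a fresh prompt is built on every call; nothing
# from the previous call is carried over. The example pairs merely
# *look* like conversation history.
def fake_ask(question, examples):
    prompt = []
    for q, sql in examples:
        prompt += [("user", q), ("assistant", sql)]
    prompt.append(("user", question))
    return prompt

examples = [("How many branches are in Beijing?",
             "SELECT COUNT(*) FROM branches WHERE city = 'Beijing';")]
p1 = fake_ask("Are there any branches in Shanghai?", examples)
p2 = fake_ask("List all branch names.", examples)
# p1 and p2 each contain the same example pair plus their own question;
# p2 does not contain p1's question, so each call is single-turn.
```

Under this reading, the token cost the reporter observed comes from the retrieved examples, which is why reducing `n_results_sql` (as suggested above) is the intended remedy.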