run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License
35.23k stars · 4.95k forks

[Question]: How do i pass custom llm model to NLSQLTableQueryEngine #7333

Closed vp999 closed 1 year ago

vp999 commented 1 year ago

Question Validation

Question

I want to use an LLM fine-tuned on the MS SQL Server dialect for generating T-SQL statements. NLSQLTableQueryEngine uses OpenAI by default, and falls back to llama.cpp if no OpenAI key is found. How do I pass my fine-tuned LLM for conversion of natural-language text to T-SQL statements?

dosubot[bot] commented 1 year ago

🤖

Hello,

To use your LLM fine-tuned on the MS SQL Server dialect with NLSQLTableQueryEngine to convert natural-language text to T-SQL statements, initialize the NLSQLTableQueryEngine with your SQLDatabase instance and a ServiceContext that carries the fine-tuned model.

Here is an example of how you can do this:

from sqlalchemy import create_engine

from llama_index import LLMPredictor
from llama_index.langchain_helpers.sql_wrapper import SQLDatabase
from llama_index.indices.service_context import ServiceContext
from llama_index.indices.struct_store.sql_query import NLSQLTableQueryEngine
from llama_index.prompts.base import Prompt
from llama_index.prompts.prompt_type import PromptType

# Create a SQLAlchemy engine for your MS SQL Server instance;
# the 'mssql' dialect is inferred from the connection URL
engine = create_engine("mssql+pyodbc://<user>:<password>@<dsn>")
sql_database = SQLDatabase(engine)

# Build a ServiceContext around your fine-tuned LLM
service_context = ServiceContext.from_defaults(
    llm_predictor=LLMPredictor(llm=your_fine_tuned_llm)
)

# Initialize your NLSQLTableQueryEngine
query_engine = NLSQLTableQueryEngine(
    sql_database=sql_database,
    service_context=service_context,
    text_to_sql_prompt=Prompt(your_text_to_sql_prompt, prompt_type=PromptType.TEXT_TO_SQL),
    tables=["your_table_name"],
)

In the above code, replace the connection URL with your MS SQL Server connection string, your_fine_tuned_llm with your fine-tuned LLM, your_text_to_sql_prompt with the prompt template you want to use for converting natural language to SQL, and your_table_name with the name of the table you want to query.

Please note that the text_to_sql_prompt should be an instance of Prompt and its prompt_type should be PromptType.TEXT_TO_SQL. The text_to_sql_prompt is used to guide the LLM model to convert natural language text to SQL statements. The tables parameter in the NLSQLTableQueryEngine should be a list of table names you want to query. If you don't specify the tables, the query engine will pull all the schema context, which could overflow the context window of the LLM.
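To make the prompt shape concrete, here is a minimal sketch of a text-to-SQL prompt template. The placeholder names (`{schema}`, `{query_str}`) and the wording are illustrative assumptions, not the exact default template shipped with LlamaIndex:

```python
# Illustrative text-to-SQL prompt template. The placeholder names and wording
# are assumptions for this sketch, not LlamaIndex's actual default prompt.
TEXT_TO_SQL_TMPL = (
    "You are an expert in T-SQL (MS SQL Server dialect).\n"
    "Given the table schema below, write a syntactically correct T-SQL query\n"
    "that answers the question. Only use columns that appear in the schema.\n\n"
    "Schema:\n{schema}\n\n"
    "Question: {query_str}\n"
    "T-SQL: "
)

def render_prompt(schema: str, query_str: str) -> str:
    """Fill the template the way a prompt class would at query time."""
    return TEXT_TO_SQL_TMPL.format(schema=schema, query_str=query_str)

prompt = render_prompt(
    schema="CREATE TABLE sales (id INT, region NVARCHAR(50), amount DECIMAL)",
    query_str="What is the total sales amount per region?",
)
```

Passing a template like this (wrapped in a `Prompt` with `PromptType.TEXT_TO_SQL`) is how you steer the model toward your target dialect.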

You can find more information in the sql_query.py and sql_guide.md files in the LlamaIndex repository.

As for the llm_predictor parameter in the ServiceContext class, it is an instance of BaseLLMPredictor and is used to predict the output of the LLM in the LlamaIndex framework. It manages the LLM, including setting system prompts and query wrapper prompts, and handling callbacks. It also provides metadata for the PromptHelper to generate prompts for the LLM.
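Conceptually, the predictor is a thin adapter between the query engine and whatever model backend you bring: format a prompt, call the model, return its completion. A toy sketch of that role (this class is hypothetical and is NOT the real BaseLLMPredictor API):

```python
# Hypothetical sketch of the predictor's role: wrap any callable model so a
# query engine can ask "given this prompt, what text comes back?".
# This toy class is NOT the real llama_index BaseLLMPredictor interface.
from typing import Callable

class ToyPredictor:
    def __init__(self, model_fn: Callable[[str], str]):
        # model_fn stands in for your fine-tuned model's generate() call
        self.model_fn = model_fn

    def predict(self, prompt_template: str, **prompt_args: str) -> str:
        # Format the prompt, call the underlying model, return its completion.
        prompt = prompt_template.format(**prompt_args)
        return self.model_fn(prompt)

# A stand-in "model" that always emits the same T-SQL, to show the flow.
predictor = ToyPredictor(lambda p: "SELECT TOP 10 * FROM sales;")
sql = predictor.predict("Question: {query_str}\nT-SQL:", query_str="show sales")
```

The real `LLMPredictor` adds system prompts, query-wrapper prompts, callbacks, and metadata on top of this basic loop, but the shape is the same.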

I hope this helps! If you have any other questions, feel free to ask.

Best, Dosu


vp999 commented 1 year ago

I'll give it a try. Closing this.