Closed sid8491 closed 1 year ago
You can pass in a service context in
index.as_query_engine(service_context=service_context)
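A minimal wiring sketch of that suggestion, assuming the legacy llama-index API (pre-0.10, where `ServiceContext` and `LLMPredictor` still exist), langchain installed, and an OpenAI key in the environment; the model name is illustrative:

```python
# Sketch only: assumes legacy llama-index (< 0.10) with ServiceContext,
# plus langchain; model name is illustrative, not from the thread.
from langchain.chat_models import ChatOpenAI
from llama_index import LLMPredictor, ServiceContext

# Service context holding the LLM you want at *query* time
query_llm = LLMPredictor(llm=ChatOpenAI(model_name="gpt-4", temperature=0))
query_ctx = ServiceContext.from_defaults(llm_predictor=query_llm)

# `index` was built earlier (possibly with a different service context);
# overriding here affects only response synthesis, not the stored embeddings.
query_engine = index.as_query_engine(service_context=query_ctx)
response = query_engine.query("What does the document say about X?")
```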
How can I do that? I want to use one model for embeddings and a different LLM for answering queries. Could you please share some sample code?
Take a look at this for customizing embeddings: https://gpt-index--1183.org.readthedocs.build/en/1183/how_to/customization/embeddings.html
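A sketch of what that page describes, again assuming the legacy llama-index API plus langchain and sentence-transformers installed; the HuggingFace model name is illustrative:

```python
# Sketch: legacy llama-index (< 0.10); a local HuggingFace model handles
# embeddings, independently of whichever LLM answers queries.
from langchain.embeddings.huggingface import HuggingFaceEmbeddings
from llama_index import LangchainEmbedding, ServiceContext

embed_model = LangchainEmbedding(
    HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
)
service_context = ServiceContext.from_defaults(embed_model=embed_model)
```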
I know that when building the index we can configure the LLM like below:
But how can I configure the LLM for querying?
Will the same LLM be used for both indexing and querying? I want to use one LLM to index the data, then load the index and use a different LLM to answer queries.
Is it possible?
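One possible end-to-end sketch, assuming the legacy llama-index API (`ServiceContext`, `load_index_from_storage`) with illustrative OpenAI model names: build and persist the index under one service context, then reload it and query under another. Note that the embedding model must stay the same across both steps, because the stored vectors were produced with it; only the synthesis LLM can safely differ.

```python
# Sketch: legacy llama-index (< 0.10); model names and paths are illustrative.
from langchain.chat_models import ChatOpenAI
from llama_index import (
    LLMPredictor, ServiceContext, SimpleDirectoryReader,
    StorageContext, VectorStoreIndex, load_index_from_storage,
)

# 1) Build and persist the index with one LLM
build_ctx = ServiceContext.from_defaults(
    llm_predictor=LLMPredictor(llm=ChatOpenAI(model_name="gpt-3.5-turbo"))
)
docs = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(docs, service_context=build_ctx)
index.storage_context.persist(persist_dir="./storage")

# 2) Later: reload the index and query with a different LLM
query_ctx = ServiceContext.from_defaults(
    llm_predictor=LLMPredictor(llm=ChatOpenAI(model_name="gpt-4"))
)
storage = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage, service_context=query_ctx)
response = index.as_query_engine().query("your question here")
```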