Closed: suryakumaran2611 closed this issue 3 months ago
@suryakumaran2611, yeah, if you have deployed all three models (the ones you will use as the multi-modal LLM, the LLM and the embedding model), your context file should look like this:
from lavague.core.token_counter import TokenCounter
from lavague.contexts.openai import AzureContext

# declare the token counter before any LLMs are initialized
token_counter = TokenCounter()

# init context
context = AzureContext(
    api_key="<YOUR_API_KEY>",
    deployment="<YOUR_DEPLOYMENT_NAME>",
    llm="<MODEL_NAME>",
    mm_llm="<MODEL_NAME>",
    endpoint="<YOUR_ENDPOINT>",
    embedding="<EMBEDDING_MODEL_NAME>",
    embedding_deployment="<YOUR_EMBEDDING_DEPLOYMENT_NAME>",
)
Then you save your file and pass the path of your context file to the lavague-qa CLI command, e.g. lavague-test -c ./my_context.py.
If you only want to use Azure for some, but not all, of the three models, let me know and I can show you an example of that too.
This guide gives more info on Azure Contexts - you just need to define a Context object called context in your lavague-qa config file: https://docs.lavague.ai/en/latest/docs/integrations/azure/
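In case it helps, here is a rough, untested sketch of what such a mixed setup could look like, where only the main LLM is served from Azure while the multi-modal LLM and embedding model stay on the standard OpenAI API. It assumes the Context class from lavague.core.context accepts llm, mm_llm and embedding objects (as the custom-context docs describe) and uses the LlamaIndex Azure/OpenAI wrappers; the model names and everything in angle brackets are placeholders to adjust to your own deployments.

from lavague.core.token_counter import TokenCounter
from lavague.core.context import Context
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.multi_modal_llms.openai import OpenAIMultiModal
from llama_index.embeddings.openai import OpenAIEmbedding

# declare the token counter before any LLMs are initialized
token_counter = TokenCounter()

# main LLM served from your Azure OpenAI deployment
azure_llm = AzureOpenAI(
    model="<MODEL_NAME>",
    deployment_name="<YOUR_DEPLOYMENT_NAME>",
    api_key="<YOUR_AZURE_API_KEY>",
    azure_endpoint="<YOUR_ENDPOINT>",
    api_version="<API_VERSION>",
)

# multi-modal LLM and embedding model kept on the standard OpenAI API
# (assumes Context(llm, mm_llm, embedding) as in the custom-context docs)
context = Context(
    llm=azure_llm,
    mm_llm=OpenAIMultiModal(model="gpt-4o", api_key="<YOUR_OPENAI_API_KEY>"),
    embedding=OpenAIEmbedding(model="text-embedding-3-small", api_key="<YOUR_OPENAI_API_KEY>"),
)

You would then pass this file to the CLI in the same way as the full-Azure example above.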
I would like to use LaVague QA with Azure OpenAI Service. Can you give me a sample configuration showing how to set up a custom context for Azure?