**EdoardoAbatiTR** opened 4 months ago
I solved this in my PR for local evaluation support but decided not to proceed with the PR: https://github.com/deepset-ai/haystack/pull/7745
You can take what I built, strip the llama.cpp bits, and keep the `generation_kwargs` sections.
So after looking more into it, Azure has its own `AzureOpenAI` class in the `openai` package. While some other services implement the OpenAI API and let us redirect to their endpoint (local or hosted) via `base_url`, that no longer seems possible for Azure: https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/migration
However, I was able to extend the LLM evaluators to support parameters for OpenAI, and if any other providers are added in the future, it would work for them as well.
@EdoardoAbatiTR Let's reopen this. I didn't mean for this to be automatically closed upon merging of my PR, as my PR doesn't solve 100% of what you requested. Feel free to follow what I posted here to implement support for Azure with the new changes: https://github.com/deepset-ai/haystack/pull/7987#issuecomment-2220922798
**Is your feature request related to a problem? Please describe.**

`LLMEvaluator` currently only supports OpenAI; it would be nice if we could use it with OpenAI models via Azure too.

**Describe the solution you'd like**

I'd like to use evaluators with Azure OpenAI (e.g. `ContextRelevanceEvaluator(api='azure-openai')`).

In addition, I propose to slightly change the design of `LLMEvaluator` to allow more flexibility. Currently the param `api_key=Secret.from_env_var("OPENAI_API_KEY")` forces the user to provide an env var that is specific to OpenAI and would not be used by other generators.

What about having something like:
?
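Roughly, the idea is an evaluator that just forwards a generic kwargs dict to whichever generator class it is given. A minimal stand-alone mock of that shape (the names `generator_class`, `generator_kwargs`, and `MockGenerator` are illustrative assumptions, not haystack's actual API):

```python
from typing import Any, Dict, Optional, Type


class MockGenerator:
    """Stand-in generator; a real one would read its own env vars when kwargs are empty."""

    def __init__(self, **kwargs: Any):
        self.kwargs = kwargs


class LLMEvaluator:
    def __init__(
        self,
        generator_class: Type = MockGenerator,
        generator_kwargs: Optional[Dict[str, Any]] = None,
    ):
        # Forward whatever the caller supplied (api keys, api_version,
        # azure_deployment, ...) -- the evaluator stays provider-agnostic.
        self.generator = generator_class(**(generator_kwargs or {}))


ev = LLMEvaluator(generator_kwargs={"azure_deployment": "my-gpt-4o"})
print(ev.generator.kwargs)  # {'azure_deployment': 'my-gpt-4o'}
```

With this shape, Azure-specific parameters travel through `generator_kwargs` untouched, and an evaluator constructed with no kwargs leaves the generator to fall back on its own env vars.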
This wouldn't force the user to pass anything generator-specific to `LLMEvaluator`. It gives the flexibility to pass anything the generator can take (e.g. API keys, API version, or `azure_deployment` in the case of Azure) via the `generator_kwargs`. At the same time, if the user doesn't pass anything, the generator would still look for its required env vars during instantiation.

I guess `api_key` needs to enter the deprecation cycle before being removed. Maybe we could just change it to `api_key=Secret.from_env_var("OPENAI_API_KEY", strict=False)` until it is deprecated, so that the var will not be required for other generators.

**Describe alternatives you've considered**
Subclassing `LLMEvaluator` (and all the child classes) into a custom component.

**Additional context**
Happy to hear your thoughts, also in case there are other better solutions I didn't consider. :)
I'm currently a bit busy with other things, but I may be able to raise a PR with the proposal in the next few days.
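To illustrate the `strict=False` idea above, here's a tiny stand-in that mimics what I understand `Secret.from_env_var` to do (an assumption about haystack's semantics, not its actual code): with `strict=False`, a missing env var resolves to `None` instead of raising, so non-OpenAI generators aren't blocked by `OPENAI_API_KEY`.

```python
import os


class FakeSecret:
    """Stand-in mimicking haystack's Secret.from_env_var semantics (assumed:
    strict=True raises on a missing env var at resolve time, strict=False returns None)."""

    def __init__(self, env_var: str, strict: bool = True):
        self._env_var = env_var
        self._strict = strict

    @classmethod
    def from_env_var(cls, env_var: str, strict: bool = True):
        return cls(env_var, strict=strict)

    def resolve_value(self):
        value = os.environ.get(self._env_var)
        if value is None and self._strict:
            raise ValueError(f"Environment variable '{self._env_var}' is not set")
        return value


# With strict=False, a missing OPENAI_API_KEY no longer blocks instantiation:
os.environ.pop("OPENAI_API_KEY", None)
lenient = FakeSecret.from_env_var("OPENAI_API_KEY", strict=False)
print(lenient.resolve_value())  # None
```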