Open cyyeh opened 3 weeks ago
This task is scheduled to be assigned to the contributor Gomaa (@MGomaa435).
implementation plan doc
For the `get_generator` method in each LLMProvider, we can implement a higher-order function that accepts model params and returns a function. When we invoke the returned function, it executes the LLM generation API, which means `acompletion` in litellm.

Hi Jimmy, my pleasure to support in this.
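The higher-order pattern described above can be sketched as follows. This is a minimal illustration, not the actual WrenAI implementation: `fake_acompletion` is a hypothetical stand-in for litellm's `acompletion` so the example runs without network access, and the function and parameter names are assumptions.

```python
import asyncio
from typing import Any, Awaitable, Callable

# Hypothetical stand-in for litellm's acompletion; a real provider would use
# `from litellm import acompletion` and call the actual API here.
async def fake_acompletion(model: str, messages: list, **kwargs: Any) -> dict:
    return {"model": model, "content": f"echo: {messages[-1]['content']}"}

def get_generator(model: str, **model_kwargs: Any) -> Callable[..., Awaitable[dict]]:
    """Higher-order function: captures model params, returns an async callable."""
    async def generate(messages: list, **call_kwargs: Any) -> dict:
        # The captured model params and per-call params are forwarded to the
        # generation API (acompletion in litellm).
        return await fake_acompletion(
            model=model, messages=messages, **model_kwargs, **call_kwargs
        )
    return generate

# Usage: configure once, then invoke the returned function per request.
generate = asyncio.run, get_generator("gpt-4o-mini", temperature=0.0)
generate = get_generator("gpt-4o-mini", temperature=0.0)
result = asyncio.run(generate([{"role": "user", "content": "hello"}]))
```

Because model configuration is captured by the closure, each provider only needs to supply its model params once; callers then treat the returned function as a uniform generation interface.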
Currently we use Haystack to provide various LLM providers; however, we found it's still too hard for the community to contribute support for additional LLMs. We plan to use litellm instead: it provides more functionality, such as fallbacks, and it exposes an OpenAI-API-compatible interface for all LLMs. We think this makes community contribution easier.