Is this a new feature, an improvement, or a change to existing functionality?
Change
How would you describe the priority of this feature request?
Medium
Please provide a clear description of the problem this feature solves
I want to be able to instantiate LLM services/clients dynamically, so that models can be swapped out via command-line arguments or config files instead of having to refactor pipeline code.
Describe your ideal solution
llm_service = LLMService.create(service, service_kwargs)
llm_client = llm_service.get_client(model_kwargs)
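One way this could work is a registry-based factory. The sketch below is only illustrative: the `LLMService.create` / `get_client` names follow the proposal above, while the registry mechanism and the `EchoService` placeholder class are assumptions, not an existing API.

```python
class LLMService:
    """Hypothetical base class with a string-keyed registry of services."""

    _registry = {}

    @classmethod
    def register(cls, name):
        # Decorator that maps a service name (e.g. from a CLI flag
        # or config file) to a concrete subclass.
        def decorator(subclass):
            cls._registry[name] = subclass
            return subclass
        return decorator

    @classmethod
    def create(cls, service, service_kwargs=None):
        # Look up the subclass by name and instantiate it.
        try:
            subclass = cls._registry[service]
        except KeyError:
            raise ValueError(f"Unknown service: {service!r}") from None
        return subclass(**(service_kwargs or {}))

    def get_client(self, model_kwargs=None):
        raise NotImplementedError


@LLMService.register("echo")
class EchoService(LLMService):
    """Placeholder service; a real one would wrap a provider SDK."""

    def __init__(self, endpoint="local"):
        self.endpoint = endpoint

    def get_client(self, model_kwargs=None):
        # A real implementation would build a provider-specific client;
        # here we just return the merged configuration.
        return {"endpoint": self.endpoint, **(model_kwargs or {})}


# Usage: the service name and kwargs could come from argparse or a config file.
llm_service = LLMService.create("echo", {"endpoint": "local"})
llm_client = llm_service.get_client({"model": "demo"})
```

With this shape, swapping models means changing `service` and the kwargs dicts, with no edits to the pipeline code itself.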
Additional context
No response
Code of Conduct