IBM / unitxt

🦄 Unitxt: a python library for getting data fired up and set for training and evaluation
https://www.unitxt.ai
Apache License 2.0

Add support for OpenAI custom base url and default headers + RITS Inference engine #1385

Closed — martinscooper closed this 7 hours ago

martinscooper commented 4 days ago
OfirArviv commented 10 hours ago

@yoavkatz Is RITS available from outside the network? Maybe that is why the test fails

martinscooper commented 8 hours ago

@OfirArviv @yoavkatz IMO we should remove VLLMRemoteInferenceEngine, as it adds no functionality beyond OpenAiInferenceEngine. Moreover, other servers and frameworks can expose LLMs through the OpenAI API, so this is not exclusive to a remote vLLM instance. Users can directly create an OpenAiInferenceEngine and set the label accordingly.

OfirArviv commented 8 hours ago

> @OfirArviv @yoavkatz IMO we should remove the VLLMRemoteInferenceEngine as it adds no functionality to OpenAiInferenceEngine. In addition, other servers/frameworks can potentially expose LLMs using the OpenAI API, so this is not something exclusive to a remote vLLM instance. Users can directly create an OpenAiInferenceEngine and set the label accordingly.

We cannot remove it right now because some users depend on it. But I defined it to be as minimal as possible: basically, it just sets the label.
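To illustrate the shape of that arrangement, here is a minimal, self-contained sketch (class and field names are assumptions for illustration, not the real unitxt API): an OpenAI-compatible engine that accepts a custom base URL and default headers, and a "remote vLLM" subclass that adds nothing but a different default label.

```python
# Hypothetical sketch of the pattern discussed above; NOT the actual
# unitxt classes. It shows why a subclass that only overrides a label
# adds no functionality over configuring the base engine directly.

from dataclasses import dataclass, field

@dataclass
class OpenAiEngineSketch:
    """OpenAI-API-compatible engine: any server exposing the OpenAI API
    (vLLM, RITS, ...) can be targeted via base_url/default_headers."""
    model_name: str
    base_url: str = "https://api.openai.com/v1"
    default_headers: dict = field(default_factory=dict)
    label: str = "openai"

@dataclass
class VLLMRemoteSketch(OpenAiEngineSketch):
    """The 'kept minimal' variant: identical behavior, only a new label."""
    label: str = "vllm-remote"

# Equivalent without the subclass: configure the base engine directly.
engine = OpenAiEngineSketch(
    model_name="my-model",
    base_url="http://my-vllm-host:8000/v1",
    label="vllm-remote",
)
```

Since the subclass changes only a default, users pointing the base engine at their own server and setting the label themselves get the same result, which is the argument for eventually removing the wrapper.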