martinscooper closed this 7 hours ago
@yoavkatz Is RITS available from outside the network? Maybe that is why the test fails.
@OfirArviv @yoavkatz IMO we should remove `VLLMRemoteInferenceEngine`, as it adds no functionality over `OpenAiInferenceEngine`. In addition, other servers/frameworks can potentially expose LLMs through the OpenAI API, so this is not exclusive to a remote vLLM instance. Users can directly create an `OpenAiInferenceEngine` and set the label accordingly.
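For illustration, here is a minimal sketch of that direct usage, pointing `OpenAiInferenceEngine` at a vLLM server's OpenAI-compatible endpoint. The parameter names (`base_url`, `label`) and the model name are assumptions for the sketch, not the confirmed unitxt API:

```python
# Hypothetical sketch: using OpenAiInferenceEngine directly against a
# vLLM server that exposes the OpenAI-compatible API. Parameter names
# (base_url, label) are assumptions and may differ in the actual class.
from unitxt.inference import OpenAiInferenceEngine

engine = OpenAiInferenceEngine(
    model_name="meta-llama/Llama-3-8b-instruct",  # whatever model the server hosts
    base_url="http://my-vllm-host:8000/v1",       # OpenAI-compatible endpoint
    label="vllm-remote",                          # label set directly by the user
)
```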
We cannot remove it right now because some users are already using it. But I defined it to be as minimal as possible: basically it just sets the label.
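Per that description, the class amounts to a thin subclass. A sketch of that shape, where the exact field name and label value are assumptions based on the comment above:

```python
# Hypothetical sketch of how minimal the wrapper is: it adds no behavior
# over OpenAiInferenceEngine and only overrides the label.
from unitxt.inference import OpenAiInferenceEngine

class VLLMRemoteInferenceEngine(OpenAiInferenceEngine):
    label: str = "vllm"  # the only difference from the parent class
```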