Closed · serser closed this issue 2 months ago
By the way, if I use a local LLM judge, do I need to set OPENAI_API_KEY like this?
lmdeploy serve api_server internlm/internlm2-chat-1_8b --server-port 23333 --model-name internlm2-chat-1_8b --api-keys sk-123456
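If the evaluation client speaks the OpenAI API, you typically don't need a real OpenAI key: point the client at the local server and reuse whatever key was passed via --api-keys. A minimal sketch, assuming the client reads the OPENAI_API_KEY and OPENAI_API_BASE environment variables (your framework may use different names or a config file):

```shell
# Key must match the value given to --api-keys at server launch.
export OPENAI_API_KEY=sk-123456
# lmdeploy's api_server exposes an OpenAI-compatible API under /v1.
export OPENAI_API_BASE=http://0.0.0.0:23333/v1
```

The value itself is arbitrary; it only has to agree on both sides so the server accepts the client's Authorization header.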
It was probably due to a firewall on the port. I switched to another port, explicitly specified the server name as 0.0.0.0, and set the api-keys as suggested. It now calls the local judge server for each output.
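For reference, a launch command reflecting that fix might look like the following (port 23334 is an illustrative stand-in for whichever free port you pick; --server-name 0.0.0.0 makes the server listen on all interfaces rather than localhost only):

```shell
lmdeploy serve api_server internlm/internlm2-chat-1_8b \
    --server-name 0.0.0.0 \
    --server-port 23334 \
    --model-name internlm2-chat-1_8b \
    --api-keys sk-123456
```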
I've set up a judge with
Now when I run the evaluation
It gives me
openai.InternalServerError: Bad Gateway
like the following. I've tried with curl and the judge responds fine. What could be causing the Bad Gateway here?
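For anyone debugging the same Bad Gateway: the curl check mentioned above can be as simple as listing the served models directly against the judge endpoint (port and key here are taken from the launch command earlier in the thread; adjust to your setup). If this succeeds but the evaluation client still fails, the mismatch is usually in the base URL, port, or key the client is configured with, or a proxy sitting between them.

```shell
# Query the OpenAI-compatible models endpoint on the local judge server.
curl http://0.0.0.0:23333/v1/models \
    -H "Authorization: Bearer sk-123456"
```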