Closed chuzixizi closed 1 month ago
In your case, could you test the endpoint api.openai.com?
Yes, I already tested the api_key; it returns responses normally.
I mean the llm_api_endpoint should be set to "api.openai.com" instead of "https://api.openai.com/v1/chat/completions".
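To illustrate the distinction, here is a minimal sketch of how a client can build the full request URL from a host-only endpoint setting. The helper name `build_url` is hypothetical and not part of EoH; only the host `api.openai.com` and the `/v1/chat/completions` path come from this thread.

```python
def build_url(llm_api_endpoint: str) -> str:
    # llm_api_endpoint is expected to be the bare host, e.g. "api.openai.com";
    # the client prepends the scheme and appends the API path itself, so
    # passing the full URL would produce a doubled, invalid path.
    return f"https://{llm_api_endpoint}/v1/chat/completions"

print(build_url("api.openai.com"))
# https://api.openai.com/v1/chat/completions
```

This is why passing "https://api.openai.com/v1/chat/completions" as the endpoint fails: the path gets appended a second time.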
It didn't work.
Could I have the detailed settings and the terminal response?
ubuntu 22.04
python 3.11
openai 1.25.0
model gpt-3.5-turbo
It is weird; I can run it with the OpenAI API using the same endpoint setting and model. I suggest the following solutions:
thanks
I set the URL, but there is a bug. What should the correct llm_api_endpoint look like? ![image](https://github.com/FeiLiu36/EoH/assets/92204327/2f85057a-42f9-4fd6-94f8-9c874264de69)