LMJOK opened this issue 4 days ago
@LMJOK Hi, can you provide your test script?
I basically ran it as is, only modifying the API calls:

```python
elif args.model == "gpt-4o-2024-05-13":
    import openai

    print(f"Using OpenAI API with model {args.model}.")
    client_model = "gpt-4o-2024-05-13"
    client = openai.OpenAI(
        api_key="XXXXXXXXXXXXXXXXXXXXXXX",
        base_url="https://api.gpt.ge/v1/",
    )
```
Command run:

```shell
python launch_scientist.py --model "gpt-4o-2024-05-13" --experiment nanoGPT --num-ideas 5
```
Traceback:

```
/home/pc/anaconda3/envs/qwen2-vl/bin/python /media/pc/Code/pfz/Qwen2-VL/tset.py
Traceback (most recent call last):
  File "/media/pc/Code/pfz/Qwen2-VL/tset.py", line 12, in <module>
    chat_response = client.chat.completions.create(
  File "/home/pc/anaconda3/envs/qwen2-vl/lib/python3.10/site-packages/openai/_utils/_utils.py", line 274, in wrapper
    return func(*args, **kwargs)
  File "/home/pc/anaconda3/envs/qwen2-vl/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 704, in create
    return self._post(
  File "/home/pc/anaconda3/envs/qwen2-vl/lib/python3.10/site-packages/openai/_base_client.py", line 1265, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/home/pc/anaconda3/envs/qwen2-vl/lib/python3.10/site-packages/openai/_base_client.py", line 942, in request
    return self._request(
  File "/home/pc/anaconda3/envs/qwen2-vl/lib/python3.10/site-packages/openai/_base_client.py", line 1046, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': '', 'type': 'BadRequestError', 'param': None, 'code': 400}
```
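Since the 400 comes back from the proxy with an empty `message`, it may help to replay the same request outside the launcher. A minimal sketch (the model name and proxy URL are taken from the snippet above; the API key is a placeholder) that builds the JSON body the SDK would send to `<base_url>/chat/completions`, so it can be inspected and replayed with curl:

```python
import json


def build_chat_payload(model: str, user_content: str) -> str:
    """Build the JSON body for POST <base_url>/chat/completions.

    `model` and `messages` are the two required fields; an empty or
    malformed `messages` list is a common cause of a 400 from
    OpenAI-compatible endpoints.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_content}],
    }
    return json.dumps(payload)


if __name__ == "__main__":
    body = build_chat_payload("gpt-4o-2024-05-13", "Say hello.")
    print(body)
    # Replay against the proxy (key is a placeholder):
    # curl https://api.gpt.ge/v1/chat/completions \
    #   -H "Authorization: Bearer XXXXXXXXXXXXXXXXXXXXXXX" \
    #   -H "Content-Type: application/json" \
    #   -d "$body"
```

If the curl replay also returns 400, the issue is with the proxy endpoint or model name rather than the launch script.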