[ICML'24] SeeAct is a system for generalist web agents that autonomously carry out tasks on any given website, with a focus on large multimodal models (LMMs) such as GPT-4V(ision).
Thanks for your work. I am trying to evaluate the offline results by running `python offline_experiments/offline_experiment.py` (I have added my API key to the code). The model I use is gpt-4-vision-preview/gpt-4-turbo, but I got the following error:
```
Traceback (most recent call last):
  File "/data/users/zhangjunlei/anaconda3/envs/webagent/lib/python3.10/site-packages/openai/api_requestor.py", line 753, in _interpret_response_line
    data = json.loads(rbody)
  File "/data/users/zhangjunlei/anaconda3/envs/webagent/lib/python3.10/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/data/users/zhangjunlei/anaconda3/envs/webagent/lib/python3.10/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/data/users/zhangjunlei/anaconda3/envs/webagent/lib/python3.10/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/data/users/zhangjunlei/code/SeeAct/src/offline_experiments/offline_experiment.py", line 71, in <module>
    output0 = generation_model.generate(
  File "/data/users/zhangjunlei/code/SeeAct/src/./demo_utils/inference_engine.py", line 123, in generate
    response1 = openai.ChatCompletion.create(
  File "/data/users/zhangjunlei/anaconda3/envs/webagent/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/data/users/zhangjunlei/anaconda3/envs/webagent/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/data/users/zhangjunlei/anaconda3/envs/webagent/lib/python3.10/site-packages/openai/api_requestor.py", line 298, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/data/users/zhangjunlei/anaconda3/envs/webagent/lib/python3.10/site-packages/openai/api_requestor.py", line 700, in _interpret_response
    self._interpret_response_line(
  File "/data/users/zhangjunlei/anaconda3/envs/webagent/lib/python3.10/site-packages/openai/api_requestor.py", line 755, in _interpret_response_line
    raise error.APIError(
openai.error.APIError: HTTP code 413 from API (<html>
<head><title>413 Request Entity Too Large</title></head>
<body>
<center><h1>413 Request Entity Too Large</h1></center>
<hr><center>openresty</center>
</body>
</html>
)
```
Do you have any idea about how to solve this problem? Thank you!
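One thing I am considering as a workaround: since HTTP 413 means the request body itself is too large, the base64-encoded screenshot may be exceeding a proxy limit, so downscaling the image before encoding could shrink the payload. A minimal sketch using Pillow (the function name `encode_screenshot` and the `max_side` limit are my own assumptions, not part of the SeeAct code):

```python
import base64
import io

from PIL import Image


def encode_screenshot(path: str, max_side: int = 1024) -> str:
    """Downscale a screenshot so its longest side is at most `max_side`,
    then return it base64-encoded for the API request payload."""
    img = Image.open(path)
    scale = max_side / max(img.size)
    if scale < 1:
        # Shrink proportionally; JPEG re-encoding below further cuts size.
        img = img.resize((int(img.width * scale), int(img.height * scale)))
    buf = io.BytesIO()
    img.convert("RGB").save(buf, format="JPEG", quality=85)
    return base64.b64encode(buf.getvalue()).decode("ascii")
```

I have not verified that the screenshot is actually the oversized part of the request, so this is just a guess at the cause.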