yding25 / GPT-Planner

Paper: Integrating Action Knowledge and LLMs for Task Planning and Situation Handling in Open Worlds
https://cowplanning.github.io/
MIT License

Error in API #5

Open liushuown opened 1 month ago

liushuown commented 1 month ago

When I run "python main.py", I receive the following error:

before running, all old files have been removed.

setting:0

---------- generating basic plan! -----------

(find_table rob table_0 dining) (walk rob dining kitchen) (find_plate rob plate_1 kitchen) (grasp_plate rob plate_1 kitchen) (move_plate rob plate_1 kitchen table_0 dining) (place_plate rob plate_1 table_0 dining) (walk_table rob table_0 dining) (find_chair rob chair_1 dining) (pull_chair rob chair_1 dining) (walk rob dining kitchen) (find_burger rob burger_1 kitchen) (find_fork rob fork_1 kitchen) (grasp_burger rob burger_1 kitchen) (move_burger rob burger_1 kitchen plate_1 dining) (place_burger rob burger_1 plate_1 dining) (walk rob dining kitchen) (grasp_fork rob fork_1 kitchen) (move_fork rob fork_1 kitchen plate_1 dining) (place_fork rob fork_1 plate_1 dining) ; cost = 19 (unit cost)

Traceback (most recent call last):
  File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/openai/http_client.py", line 221, in request
    kwargs,
  File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, send_kwargs)
  File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, kwargs)
  File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/requests/adapters.py", line 497, in send
    chunked=chunked,
  File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/urllib3/connectionpool.py", line 803, in urlopen
    response_kw,
  File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/urllib3/connectionpool.py", line 505, in _make_request
    enforce_content_length=enforce_content_length,
  File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/urllib3/connection.py", line 394, in request
    self.putheader(header, value)
  File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/urllib3/connection.py", line 308, in putheader
    super().putheader(header, *values)
  File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/http/client.py", line 1259, in putheader
    raise ValueError('Invalid header value %r' % (values[i],))
ValueError: Invalid header value b'Bearer sk-Ma4ZnyqCPja**B4\n'

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "main.py", line 77, in situation_index, situation, situation_opp, situation_object, situation_predicate, situation_action = situation_simulator(task_id) File "/home/liushuo/GPT-Planner/situation_simulator.py", line 824, in situation_simulator situation_predicate = predicate_generator(situation) File "/home/liushuo/GPT-Planner/utility.py", line 253, in predicate_generator stop=['\n', '.'] File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/openai/api_resources/completion.py", line 31, in create return super().create(*args, **kwargs) File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 67, in create "post", url, params, headers, stream=stream File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/openai/api_requestor.py", line 127, in request method.lower(), url, params, headers, stream=stream File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/openai/api_requestor.py", line 322, in request_raw method, abs_url, headers, post_data, stream=stream File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/openai/http_client.py", line 87, in request_with_retries raise connection_error File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/openai/http_client.py", line 58, in request_with_retries response = self.request(method, url, headers, post_data, stream=stream) File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/openai/http_client.py", line 246, in request self._handle_request_error(e) File "/home/liushuo/anaconda3/envs/cowplanning/lib/python3.7/site-packages/openai/http_client.py", line 306, in _handle_request_error raise error.APIConnectionError(msg, should_retry=should_retry) openai.error.APIConnectionError: Unexpected error communicating with OpenAI. It looks like there's probably a configuration issue locally. If this problem persists, let us know at support@openai.com.

(Network error: A ValueError was raised with error message Invalid header value b'Bearer sk-Ma4ZnyqCPja**B4\n')
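The rejected header value ends in a literal \n, which usually means the API key string was read with a trailing newline (for example from a file or environment variable). A minimal sketch of sanitizing the key before the pre-1.0 openai package (the one shown in the traceback) builds the Authorization header; the key file path and the proxy base URL are assumptions, so adapt them to however the key is actually loaded in your setup:

import os
import openai  # the pre-1.0 openai package, as shown in the traceback above

# Hypothetical key file; in your setup the key may come from a config
# variable or an environment variable instead.
key_path = os.path.expanduser("~/.openai_api_key")
with open(key_path) as f:
    # strip() removes the trailing newline that otherwise ends up inside the
    # "Authorization: Bearer <key>" header and raises the ValueError above.
    openai.api_key = f.read().strip()

# Only needed if requests are routed through a proxy endpoint.
openai.api_base = "https://api.xiaoai.plus/v1"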

However, I have tested my api_key using the following code, and it runs successfully.

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    openai_api_base="https://api.xiaoai.plus/v1",
    openai_api_key="sk-xxxxx",
)
res = llm.invoke("hello")
print(res.content)

How can I fix this error?

yding25 commented 1 month ago

Thanks for trying the script. I believe you tested an older version; could you please try the Colab version instead? I strongly recommend it, as it is easier to run. This project was built a few years ago, and I'm not sure whether some of the APIs it uses, such as ChatGPT's, have been updated since.

liushuown commented 1 month ago

Hi, thanks for your help! As you suggested, I changed "def llm" and "def predicate_generator" and updated my openai package to version 1.9.0, and the API problem seems to be solved (a sketch of this kind of change appears after the log below). However, I'm still a bit puzzled: when my task id = 1, my output currently looks like this:

before running, all old files have been removed.

setting:1

---------- generating basic plan! -----------

(walk rob kitchen dining) (find_table rob table_0 dining) (grasp_vacuum rob vacuum_2 dining) (plug_vacuum rob vacuum_2 outlet_1 dining) (turnon_vacuum rob vacuum_2 dining) (clean_area rob vacuum_2 table_0 dining) (turnoff_vacuum rob vacuum_2 dining) (unplug_vacuum rob vacuum_2 dining) ; cost = 8 (unit cost)

utility.py predicate_generator result: ower_outage_exists

---------- generating situation! -----------

situation index: 0
situation: there is a power outage.
action corresponding to situation: turnon_vacuum
corresponding predicate: power_outage_exists
object manipulated by robot: vacuum_2
object in situation: power

---------- executing plan! -----------

action: ['walk', 'rob', 'kitchen', 'dining'] action (decoded): a robot walks from a kitchen room to a dining room. this action is executed!

action: ['find_table', 'rob', 'table_0', 'dining'] action (decoded): a robot finds a table in a dining room. this action is executed!

action: ['grasp_vacuum', 'rob', 'vacuum_2', 'dining'] action (decoded): a robot grasps a vacuum in a dining room. this action is executed!

action: ['plug_vacuum', 'rob', 'vacuum_2', 'outlet_1', 'dining'] action (decoded): a robot plugs a vacuum using outlet in a dining room. this action is executed!

action: ['turnon_vacuum', 'rob', 'vacuum_2', 'dining'] action (decoded): a robot turns on a vacuum in a dining room.

---------- checking unexecuted actions! -----------

['turnon_vacuum', 'rob', 'vacuum_2', 'dining']
['clean_area', 'rob', 'vacuum_2', 'table_0', 'dining']
['turnoff_vacuum', 'rob', 'vacuum_2', 'dining']
['unplug_vacuum', 'rob', 'vacuum_2', 'dining']

unexecuted action: ['turnon_vacuum', 'rob', 'vacuum_2', 'dining']
unexecuted action (decoded): a robot turns on a vacuum in a dining room.
! prompt design
raw prompt: is it suitable that a robot turns on a vacuum if there is a power outage?
! experience found
! response from LLM
response (raw prompt): a

---------- action can be executed! -----------

unexecuted action: ['clean_area', 'rob', 'vacuum_2', 'table_0', 'dining']
unexecuted action (decoded): a robot cleans the area beside a table using vacuum in a dining room.
! prompt design
raw prompt: is it suitable that a robot cleans the area beside a table using vacuum if there is a power outage?
! experience found
! response from LLM
response (raw prompt): no,

---------- current plan cannot be executed! -----------

---------- adding constraint -----------

! step 1: supplement constraint to action precondition
step 1 is done!
! step 2: supplement action's parameter
step 2 is done!
! step 3: supplement type
step 3 is done!
! step 4: supplement predicates
step 4 is done!
! step 5: supplement init
step 5 is done!
! step 6: supplement object
step 6 is done!

---------- generating modified_plan_1! -----------

---------- no modified_plan_1 found! -----------

---------- call llm_object -----------

---------- object that robot can operate: ----------

['cutlery knife', 'dish bowl', 'coffee cup', 'cutlery fork', 'mat', 'measuring cup', 'bucket', 'cleaning bottle', 'frying pan', 'drinking glass', 'oven tray', 'cooking pot', 'trash can', 'cutting board']

---------- capable_objects: ----------

None

---------- most possible object: ----------

None

no modified_plan_2 found!

---------- call llm_appliance -----------

appliance that robot can operate: ['toaster', 'juicer', 'fridge', 'coffee maker', 'water filter', 'stove', 'microwave', 'ice cream maker', 'dishwasher']

! prompt design prompt (raw): can a toaster make power available if there is a power outage? answer: ! experience found ! results from LLM response (raw prompt): no

! prompt design prompt (raw): can a juicer make power available if there is a power outage? answer: ! experience found ! results from LLM response (raw prompt): no

! prompt design prompt (raw): can a fridge make power available if there is a power outage? answer: ! experience found ! results from LLM response (raw prompt): no

! prompt design prompt (raw): can a coffee maker make power available if there is a power outage? answer: ! experience found ! results from LLM response (raw prompt): no

! prompt design prompt (raw): can a water filter make power available if there is a power outage? answer: ! experience found ! results from LLM response (raw prompt): no

! prompt design prompt (raw): can a stove make power available if there is a power outage? answer: ! experience found ! results from LLM response (raw prompt): no

! prompt design prompt (raw): can a microwave make power available if there is a power outage? answer: ! experience found ! results from LLM response (raw prompt): no

! prompt design prompt (raw): can a ice cream maker make power available if there is a power outage? answer: ! experience found ! results from LLM response (raw prompt): no

! prompt design prompt (raw): can a dishwasher make power available if there is a power outage? answer: ! experience found ! results from LLM response (raw prompt): no

no capable appliance found!

no modified_plan_3 found!

---------- solution -----------

none of modified_plan_1, modified_plan_2 and modified_plan_3 exist.

Is that right?
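For reference, here is a minimal sketch of the kind of change described above, i.e. rewriting the completion call against the openai>=1.0 client. The function name, model, and parameters are illustrative assumptions, not the repo's actual "def llm" signature:

from openai import OpenAI

# base_url and model are assumptions; point them at whatever endpoint/model you use.
client = OpenAI(
    api_key="sk-xxxxx",
    base_url="https://api.xiaoai.plus/v1",
)

def llm(prompt, stop=("\n", ".")):
    # openai>=1.0 replaces openai.Completion.create with client.chat.completions.create
    # (or client.completions.create for legacy completion models).
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        stop=list(stop),
        temperature=0,
    )
    return response.choices[0].message.content.strip()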

yding25 commented 1 month ago

Thank you for revising "def llm" and "def predicate_generator"; I'm glad you achieved the desired result. In your case, where "there is a power outage," there is no way to resolve the situation, so this is the expected outcome; we discussed similar cases in the paper.