kaben opened this issue 1 year ago
Hello, whenever `run xxx` ends, this error appears. Has anyone else encountered this problem?
```
Traceback (most recent call last):
  File "/data/generative_agents/reverie/backend_server/reverie.py", line 471, in open_server
    rs.start_server(int_count)
  File "/data/generative_agents/reverie/backend_server/reverie.py", line 379, in start_server
    next_tile, pronunciatio, description = persona.move(
  File "/data/generative_agents/reverie/backend_server/persona/persona.py", line 222, in move
    plan = self.plan(maze, personas, new_day, retrieved)
  File "/data/generative_agents/reverie/backend_server/persona/persona.py", line 148, in plan
    return plan(self, maze, personas, new_day, retrieved)
  File "/data/generative_agents/reverie/backend_server/persona/cognitive_modules/plan.py", line 959, in plan
    _determine_action(persona, maze)
  File "/data/generative_agents/reverie/backend_server/persona/cognitive_modules/plan.py", line 573, in _determine_action
    generate_task_decomp(persona, act_desp, act_dura))
  File "/data/generative_agents/reverie/backend_server/persona/cognitive_modules/plan.py", line 164, in generate_task_decomp
    return run_gpt_prompt_task_decomp(persona, task, duration)[0]
  File "/data/generative_agents/reverie/backend_server/persona/prompt_template/run_gpt_prompt.py", line 439, in run_gpt_prompt_task_decomp
    output = safe_generate_response(prompt, gpt_param, 5, get_fail_safe(),
  File "/data/generative_agents/reverie/backend_server/persona/prompt_template/gpt_structure.py", line 262, in safe_generate_response
    return func_clean_up(curr_gpt_response, prompt=prompt)
  File "/data/generative_agents/reverie/backend_server/persona/prompt_template/run_gpt_prompt.py", line 378, in __func_clean_up
    duration = int(k[1].split(",")[0].strip())
IndexError: list index out of range
Error.
```
babytdream commented:
Hi, I'm facing the same issue. Did you find a solution?
@joonspk-research: your code comments indicate you're troubleshooting validation errors in `run_gpt_prompt_task_decomp(...)`:
https://github.com/joonspk-research/generative_agents/blob/fe05a71d3e4ed7d10bf68aa4eda6dd995ec070f4/reverie/backend_server/persona/prompt_template/run_gpt_prompt.py#L364
https://github.com/joonspk-research/generative_agents/blob/fe05a71d3e4ed7d10bf68aa4eda6dd995ec070f4/reverie/backend_server/persona/prompt_template/run_gpt_prompt.py#L417
Here's an example GPT response that triggers the error:
The final line causes the error: it is not in the format the parsing code expects, so `k[1]` raises `IndexError` when the split produces fewer elements than assumed.
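A defensive variant of that parse could return a sentinel instead of crashing when a response line is malformed. The sketch below is hypothetical (it is not the repo's code, and `parse_duration` is an invented helper); it assumes task lines carry a `duration in minutes:` marker followed by an integer and a comma, which is the shape the failing `int(k[1].split(",")[0].strip())` line appears to expect:

```python
def parse_duration(line):
    """Extract a task duration from one line of an LLM task-decomposition
    response. Returns an int, or None when the line lacks the expected
    'duration in minutes:' marker or a parseable integer after it --
    e.g. when the model emits an apology or error message instead of
    a task line."""
    parts = line.split("duration in minutes:")
    if len(parts) < 2:
        return None  # marker absent: malformed or non-task line
    try:
        return int(parts[1].split(",")[0].strip())
    except ValueError:
        return None  # marker present, but what follows is not an integer

# A well-formed line parses; a free-text error line is flagged instead
# of raising IndexError deep inside __func_clean_up.
print(parse_duration("Wake up (duration in minutes: 5, minutes left: 55)"))
print(parse_duration("I'm sorry, I made a mistake in the task list."))
```

Callers could then skip or retry lines that come back as `None`, rather than letting one bad response line abort the whole simulation step.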
(The reason it's in the wrong format is interesting: the LLM is trying to alert you to a separate error it made. In other words, it's trying to help by telling you about its own mistake. You might be able to use this to your advantage by asking the LLM to report errors in a format you can easily parse and log.)