I tried the Llama-2-7b-chat-hf model on the programming tasks, using the prompt written for the CodeLlama model. The dataset I use is mbpp-py.jsonl. But the prompt doesn't seem to suit Llama-2-7b-chat-hf very well, and it gives me this error:
GENERATED FUNC BODY
None
Traceback (most recent call last):
  File "/home/user/botao/reflexion-main/programming_runs/my_main_llama2.py", line 128, in <module>
    main(args)
  File "/home/user/botao/reflexion-main/programming_runs/my_main_llama2.py", line 111, in main
    run_strategy(
  File "/home/user/botao/reflexion-main/programming_runs/my_main_llama2.py", line 51, in kwargs_wrapper
    return func(**kwargs)
  File "/home/user/botao/reflexion-main/programming_runs/reflexion.py", line 75, in run_reflexion
    assert isinstance(cur_func_impl, str)
AssertionError
Is there a better prompt for this model?
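For context, this is roughly the direction I was considering: wrapping the instruction in the Llama-2 chat format before generation. This is only a minimal sketch; the Hugging Face calls are standard, but the system/task strings and generation settings are placeholders, not the ones actually used in this repo.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, device_map="auto")

# Llama-2-chat expects the [INST] ... [/INST] wrapping, optionally with a <<SYS>> block.
# The system/task strings below are placeholders, not the repo's actual prompts.
system_msg = (
    "You are a Python programming assistant. "
    "Respond with only the body of the requested function."
)
task = "Write a function to find the shared elements from the given two lists."

# No explicit <s>: the tokenizer adds the BOS token itself.
prompt = f"[INST] <<SYS>>\n{system_msg}\n<</SYS>>\n\n{task} [/INST]"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=False)

# Decode only the newly generated tokens, skipping the prompt.
completion = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(completion)
```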