Closed · AU-CPU closed this issue 2 days ago
@Yanbin-Yin
Line 6: `from reasoners.tools import wikisearch, wikilookup, wikifinish`
In this example we use just these three tools, and you can use them to define a toolset.
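For reference, a toolset here is presumably just an ordered collection of the three tool callables. A minimal sketch with stub functions standing in for the real `reasoners.tools` imports (the ordering is an assumption):

```python
# Stubs standing in for wikisearch / wikilookup / wikifinish from
# reasoners.tools; the real functions take env/step_idx/action/thought.
def wikisearch(**kwargs):
    return "search result"

def wikilookup(**kwargs):
    return "lookup result"

def wikifinish(**kwargs):
    return "finish result"

# The toolset is an ordered list; downstream code indexes into it,
# so the order matters.
toolset = [wikisearch, wikilookup, wikifinish]
```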
I use `from reasoners.tools import wikisearch, wikilookup, wikifinish` and change the code to:

```python
if "Search" in action:
    new_state = current_state + self.toolset[0](env=self.base_model, step_idx=step_idx, action=action, thought=thought)
    print("Search tool is used")
elif "Lookup" in action:
    new_state = current_state + self.toolset[0](env=self.base_model, step_idx=step_idx, action=action, thought=thought)
    # new_state = current_state + HotpotQATools.wikilookup(env=self.base_model, step_idx=step_idx, action=action, thought=thought)
    print("Lookup tool is used")
elif "Finish" in action:
    new_state = current_state + self.toolset[0](env=self.base_model, step_idx=step_idx, action=action, thought=thought)
    # new_state = current_state + HotpotQATools.wikifinish(env=self.base_model, step_idx=step_idx, action=action, thought=thought)
    print("Finish tool is used")
```

but I get output such as this:

```
correct=False, output=None, answer='Chief of Protocol'; accuracy=0.000 (0/2)
```
This means you aren't actually using these tools. There are three tools, but you only ever call `toolset[0]`.
You can print `new_state` and `current_state` to check whether the tools are actually being used.
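For comparison, each branch presumably has to index a different tool. A minimal self-contained sketch of the dispatch with stub tools (the real tool signatures take `env`/`step_idx`/`action`/`thought`):

```python
# Stub tools standing in for wikisearch / wikilookup / wikifinish.
def wikisearch(**kwargs):
    return "\n[search observation]"

def wikilookup(**kwargs):
    return "\n[lookup observation]"

def wikifinish(**kwargs):
    return "\n[finish observation]"

toolset = [wikisearch, wikilookup, wikifinish]

def apply_action(current_state, action):
    # Each branch must use its own index: 0 for Search, 1 for Lookup,
    # 2 for Finish -- not toolset[0] everywhere.
    if "Search" in action:
        return current_state + toolset[0](action=action)
    elif "Lookup" in action:
        return current_state + toolset[1](action=action)
    elif "Finish" in action:
        return current_state + toolset[2](action=action)
    return current_state
```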
Looks like the error is here. I added `print("action", action)` before `thought, action = action.strip().split(f"\nAction {step_idx}: ")` and found that the action exceeds the max length.
I guess this is because I used Llama-2-70B-GPTQ through ExLlamaModel, since the meta-llama/Meta-Llama-3-8B checkpoint I found does not have a params.json file, so I cannot load Llama 3 correctly. Can you provide the address of the Llama checkpoint you are using? @Yanbin-Yin
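As a side note, this kind of split failure can be detected explicitly instead of crashing on unpacking. A hypothetical defensive version of that line (`parse_step` is an illustrative name, not from the repo):

```python
def parse_step(text, step_idx):
    # Split the model output into thought and action; raise a clear error
    # when the "Action {step_idx}:" delimiter is missing, which is what
    # happens when the generation is cut off by the max length.
    delim = f"\nAction {step_idx}: "
    if delim not in text:
        raise ValueError(f"no {delim!r} in model output; likely truncated")
    thought, action = text.strip().split(delim, 1)
    return thought, action
```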
This has been resolved.
```
CUDA_VISIBLE_DEVICES=0 python3 -m torch.distributed.run --nproc_per_node 1 examples/ReAct/hotpotqa/inference.py --base_lm llama3 --model_dir /xx/llama3-8b/ --llama_size "8B" --batch_size 1 --temperature 0
```

I am unable to run inference.py correctly.
```python
reasoner = Reasoner(
    world_model=HotpotqaWorldModel(world_base_model, toolset),
    search_config=HotpotqaSearchConfig(base_model),
    search_algo=GreedySearch(7),
)
```

```
TypeError: Hotpotqaevaluator.__init__() missing 1 required positional argument: 'toolset'
```

It seems the cause of the missing `toolset` is that I could not find a definition of `toolset` in inference.py. I hope to get a reply.
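If it helps, the error message itself suggests the fix: define a toolset in inference.py and pass it to the evaluator. A self-contained sketch with a stub class standing in for `Hotpotqaevaluator` (the real constructor takes additional arguments):

```python
# Stub illustrating the TypeError and its fix; the real Hotpotqaevaluator
# from the repo has more parameters than this.
class Hotpotqaevaluator:
    def __init__(self, toolset):
        # 'toolset' is a required positional argument, hence the TypeError
        # when it is not supplied.
        self.toolset = toolset

# Define the toolset in inference.py (tool names taken from the earlier
# comments in this thread) and pass it through explicitly.
toolset = ["wikisearch", "wikilookup", "wikifinish"]
evaluator = Hotpotqaevaluator(toolset=toolset)
```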