Closed HusterYoung closed 1 year ago
Hi, evaluation using multiple environments is not supported currently.
Thanks for your reply. Is there any other way to speed up evaluation? In the current setup it takes 4-5 hours to evaluate 100 episodes.
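One possible workaround (my own sketch, not something the repo provides): keep each evaluator single-environment, but launch several processes, each evaluating a disjoint shard of the episodes, and merge the per-episode metrics at the end. The names `shard_episodes`, `evaluate_shard`, and `parallel_evaluate` are hypothetical; `evaluate_shard` is a placeholder for "build one env + agent and run these episodes".

```python
from multiprocessing import Pool


def shard_episodes(episode_ids, num_workers):
    """Split episode ids round-robin, one disjoint shard per worker."""
    return [episode_ids[i::num_workers] for i in range(num_workers)]


def evaluate_shard(episode_ids):
    # Placeholder: in practice, construct one habitat env + ObjectNavAgent
    # here and evaluate only `episode_ids`, returning per-episode metrics.
    return {eid: {"success": None} for eid in episode_ids}


def parallel_evaluate(episode_ids, num_workers=4):
    """Run num_workers single-env evaluators in parallel and merge results."""
    shards = shard_episodes(episode_ids, num_workers)
    with Pool(num_workers) as pool:
        per_shard = pool.map(evaluate_shard, shards)
    merged = {}
    for result in per_shard:
        merged.update(result)
    return merged
```

Since each process owns its own simulator and agent, this sidesteps the batch-size-1 assumption entirely, at the cost of N times the memory footprint.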
Yeah, I ran into the same slow-inference issue when running objectnav_agent.py, which uses the semantic map + frontier-based exploration policy. May I ask how you eventually solved the problem?
Hello, is parallel evaluation supported now?
In hssd_eval.yaml there is a parameter called "NUM_ENVIRONMENTS", which defaults to 1 for evaluation (20 for training). When I set it to a value greater than one, the following error is reported:
```
Loading pretrained CLIP
Traceback (most recent call last):
  File "/home/young/mambaforge/envs/home-robot/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/young/home-robot/src/home_robot/home_robot/agent/objectnav_agent/objectnav_agent.py", line 208, in prepare_planner_inputs
    ) = self.module(
  File "/home/young/mambaforge/envs/home-robot/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/young/mambaforge/envs/home-robot/lib/python3.9/site-packages/torch/nn/parallel/data_parallel.py", line 169, in forward
    return self.module(*inputs[0], **kwargs[0])
  File "/home/young/mambaforge/envs/home-robot/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/young/home-robot/src/home_robot/home_robot/agent/objectnav_agent/objectnav_agent_module.py", line 145, in forward
    ) = self.semantic_map_module(
  File "/home/young/mambaforge/envs/home-robot/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/young/mambaforge/envs/home-robot/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/young/home-robot/src/home_robot/home_robot/mapping/semantic/categorical_2d_semantic_map_module.py", line 254, in forward
    local_map, local_pose = self._update_local_map_and_pose(
  File "/home/young/home-robot/src/home_robot/home_robot/mapping/semantic/categorical_2d_semantic_map_module.py", line 615, in _update_local_map_and_pose
    rot_mat, trans_mat = ru.get_grid(st_pose, agent_view.size(), dtype)
  File "/home/young/home-robot/src/home_robot/home_robot/utils/rotation.py", line 106, in get_grid
    rot_grid = F.affine_grid(theta1, torch.Size(grid_size), align_corners=False).to(
  File "/home/young/mambaforge/envs/home-robot/lib/python3.9/site-packages/torch/nn/functional.py", line 4332, in affine_grid
    return torch.affine_grid_generator(theta, size, align_corners)
RuntimeError: Expected size for first two dimensions of batch2 tensor to be: [1, 3] but got: [2, 3].
```
Does anyone know what the cause might be?
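A plausible reading of the traceback (my assumption, not confirmed by the maintainers): the pose tensor reaching `ru.get_grid` carries a batch dimension equal to NUM_ENVIRONMENTS (here 2), while the grid size built from `agent_view.size()` still has batch 1, so the batched matrix multiply inside `F.affine_grid` fails its shape check. A torch-free sketch of that check, using a hypothetical `check_bmm_shapes` helper that mimics `torch.bmm`'s validation:

```python
def check_bmm_shapes(batch1_shape, batch2_shape):
    """Mimic the batch/shape validation torch.bmm performs (simplified).

    batch1_shape: (b, n, m) -- here the base grid, batch taken from `size`
    batch2_shape: (b, m, p) -- here theta.transpose(1, 2), batch from the pose
    """
    b1, n, m1 = batch1_shape
    b2, m2, p = batch2_shape
    if (b2, m2) != (b1, m1):
        raise RuntimeError(
            "Expected size for first two dimensions of batch2 tensor to be: "
            f"[{b1}, {m1}] but got: [{b2}, {m2}]."
        )
    return (b1, n, p)


# NUM_ENVIRONMENTS = 1: grid batch and pose batch agree, bmm is well-formed.
check_bmm_shapes((1, 240 * 240, 3), (1, 3, 2))

# NUM_ENVIRONMENTS = 2: the pose contributes batch 2 but the grid size still
# has batch 1, reproducing the "[1, 3] but got: [2, 3]" mismatch above.
try:
    check_bmm_shapes((1, 240 * 240, 3), (2, 3, 2))
except RuntimeError as e:
    print(e)
```

If that reading is right, the fix would be to make the batch dimension of the map/grid tensors track NUM_ENVIRONMENTS as well, rather than being hardcoded to 1, but the mapping code as shipped assumes a single environment.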