YongzhengCui opened this issue 2 years ago
I pushed a fix. Please try again.
Thanks for your reply; that problem is solved. However, there is another problem when I try to generate the training data: the log only prints "0", and no data is generated under the specified directory. The command I run is:
python store_episodes_parallel.py --gpu_capacity 1 --scenes_list HxpKQynjfin --episodes_save_dir /my_model --root_path /home/ros/cyz/habitat-lab --episode_len 10
The corresponding log output is:
I0424 21:41:57.548833 12079 simulator.py:213] Loaded navmesh /home/ros/cyz/habitat-labhabitat-api/data/scene_datasets/mp3d/HxpKQynjfin/HxpKQynjfin.navmesh
0
Now the pool is closed and no longer available
Looking forward to hearing from you, thank you very much.
This looks like a path issue. The path in your log, /home/ros/cyz/habitat-labhabitat-api/data/scene_datasets/mp3d/HxpKQynjfin/HxpKQynjfin.navmesh, is missing a separator between habitat-lab and habitat-api and needs to be fixed.
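A doubled path like habitat-labhabitat-api (no slash between the two directories) typically comes from joining --root_path with the scenes sub-path by plain string concatenation. A minimal sketch of the failure and the usual fix, assuming the script builds the path roughly like this (the variable names here are illustrative, not the actual UPEN code):

```python
import os

root_path = "/home/ros/cyz/habitat-lab"          # value passed via --root_path
scenes_dir = "habitat-api/data/scene_datasets/"  # default from the options dump

# Plain concatenation drops the separator and yields the broken path from the log:
broken = root_path + scenes_dir
print(broken)  # /home/ros/cyz/habitat-labhabitat-api/data/scene_datasets/

# os.path.join (or a trailing "/" on --root_path) produces a valid path:
fixed = os.path.join(root_path, scenes_dir)
print(fixed)   # /home/ros/cyz/habitat-lab/habitat-api/data/scene_datasets/
```

If the script does concatenate strings directly, passing --root_path with a trailing slash (/home/ros/cyz/habitat-lab/) is the quickest workaround without touching the code.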
Hi, I am also plagued by this problem. The path to my data is /home/cbl/habitat-lab/habitat-api/data/scene_datasets/mp3d/HxpKQynjfin/HxpKQynjfin.navmesh, but I am also unable to generate data.
Did you confirm that this is the right path to the glb and navmesh files? You probably don't need both habitat-lab and habitat-api in the path.
Yes, I'm sure the glb and navmesh files can be found in the corresponding directory.
@BoLeiChen have you solved this problem? I'm hitting the same one.
@Cyz981126 have you solved this problem? I'm hitting the same one. Thank you very much.
Hi,
I have a problem: when I run the command
python store_episodes_parallel.py --gpu_capacity 1 --scenes_list HxpKQynjfin --episodes_save_dir /home/ros/cyz/UPEN/my_models/ --root_path /home/ros/cyz/habitat-lab --episode_len 10
an error like the one below occurs.

options: split train grid_dim 768 crop_size 160 cell_size 0.05 turn_angle 30 forward_step_size 0.25 n_spatial_classes 3 img_size 256 max_num_episodes 2500 episode_len 10 truncate_ep True occupancy_height_thresh -1.0 scenes_list ['HxpKQynjfin'] root_path /home/ros/cyz/habitat-lab episodes_path habitat-api/data/datasets/objectnav/mp3d/ ep_set v1 episodes_root scenes_dir habitat-api/data/scene_datasets/ episodes_save_dir /home/ros/cyz/UPEN/my_models/ gpu_capacity 1

multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "/home/ros/anaconda3/envs/UPEN/lib/python3.6/multiprocessing/pool.py", line 119, in worker
    result = (True, func(*args, **kwds))
  File "/home/ros/anaconda3/envs/UPEN/lib/python3.6/multiprocessing/pool.py", line 47, in starmapstar
    return list(itertools.starmap(args[0], args[1]))
  File "/home/ros/cyz/UPEN/store_episodes_parallel.py", line 67, in store_episodes
    data = HabitatDataScene(options, config_file, scene_id=scene_id, existing_episode_list=existing_episode_list)
  File "/home/ros/cyz/UPEN/datasets/dataloader.py", line 87, in __init__
    cfg.ENVIRONMENT.MAX_EPISODE_STEPS = options.max_steps
AttributeError: 'Namespace' object has no attribute 'max_steps'
"""
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "store_episodes_parallel.py", line 153, in
    pool.starmap(store_episodes, args)
  File "/home/ros/anaconda3/envs/UPEN/lib/python3.6/multiprocessing/pool.py", line 274, in starmap
    return self._map_async(func, iterable, starmapstar, chunksize).get()
  File "/home/ros/anaconda3/envs/UPEN/lib/python3.6/multiprocessing/pool.py", line 644, in get
    raise self._value
AttributeError: 'Namespace' object has no attribute 'max_steps'
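This crash is separate from the path issue: options comes from argparse, and dataloader.py reads options.max_steps, an attribute the parser never defined. A minimal reproduction and two possible workarounds, assuming options is an ordinary argparse.Namespace (the flag name --max_steps and the default of 500 are assumptions here; check the repo's actual argument parser):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--episode_len", type=int, default=10)
options = parser.parse_args([])

# Reproduces the error from the traceback: the attribute was never registered.
try:
    options.max_steps
except AttributeError as e:
    print(e)  # 'Namespace' object has no attribute 'max_steps'

# Workaround 1: register the missing flag in the parser before parse_args().
parser.add_argument("--max_steps", type=int, default=500)

# Workaround 2: patch the namespace before it reaches HabitatDataScene.
options.max_steps = 500
print(options.max_steps)  # 500
```

Either way, the value should match whatever episode step limit the Habitat config expects for ENVIRONMENT.MAX_EPISODE_STEPS.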