facebookresearch / habitat-lab

A modular high-level library to train embodied AI agents across a variety of tasks and environments.
https://aihabitat.org/
MIT License

Humanoids_tutorial.ipynb: No such file or directory: 'data/hab3_bench_assets/episode_datasets/small_large.json.gz' #1993

Open Skevinci opened 4 months ago

Skevinci commented 4 months ago

Habitat-Lab and Habitat-Sim versions

Habitat-Lab: 0.3.1

Habitat-Sim: 0.3.1

🐛 Bug

When running the environment initialization code in examples/tutorials/humanoids_tutorial.ipynb, it reports the following error:

FileNotFoundError                         Traceback (most recent call last)
Cell In[4], line 6
      1 # Define the actions
      3 action_dict = {
      4     "humanoid_joint_action": HumanoidJointActionConfig()
      5 }
----> 6 env = init_rearrange_env(agent_dict, action_dict)

Cell In[2], line 46
     44 hab_cfg = make_hab_cfg(agent_dict, action_dict)
     45 res_cfg = OmegaConf.create(hab_cfg)
---> 46 return Env(res_cfg)

File ~/research/habitat-lab/habitat-lab/habitat/core/env.py:88, in Env.__init__(self, config, dataset)
     86 self._dataset = dataset
     87 if self._dataset is None and config.dataset.type:
---> 88     self._dataset = make_dataset(
     89         id_dataset=config.dataset.type, config=config.dataset
     90     )
     92 self._current_episode = None
     93 self._episode_iterator = None

File ~/research/habitat-lab/habitat-lab/habitat/datasets/registration.py:22, in make_dataset(id_dataset, **kwargs)
     19 _dataset = registry.get_dataset(id_dataset)
     20 assert _dataset is not None, "Could not find dataset {}".format(id_dataset)
---> 22 return _dataset(**kwargs)

File ~/research/habitat-lab/habitat-lab/habitat/datasets/rearrange/rearrange_dataset.py:74, in RearrangeDatasetV0.__init__(self, config)
     70     logger.info("Downloaded and extracted the data.")
     72 check_and_gen_physics_config()
---> 74 super().__init__(config)

File ~/research/habitat-lab/habitat-lab/habitat/datasets/pointnav/pointnav_dataset.py:116, in PointNavDatasetV1.__init__(self, config)
    112     return
    114 datasetfile_path = config.data_path.format(split=config.split)
--> 116 self._load_from_file(datasetfile_path, config.scenes_dir)
    118 # Read separate file for each scene
    119 dataset_dir = os.path.dirname(datasetfile_path)

File ~/research/habitat-lab/habitat-lab/habitat/datasets/pointnav/pointnav_dataset.py:105, in PointNavDatasetV1._load_from_file(self, fname, scenes_dir)
    103     self.from_binary(pickle.load(f), scenes_dir=scenes_dir)
    104 else:
--> 105     with gzip.open(fname, "rt") as f:
    106         self.from_json(f.read(), scenes_dir=scenes_dir)

File ~/miniforge3/envs/habitat/lib/python3.9/gzip.py:58, in open(filename, mode, compresslevel, encoding, errors, newline)
     56 gz_mode = mode.replace("t", "")
     57 if isinstance(filename, (str, bytes, os.PathLike)):
---> 58     binary_file = GzipFile(filename, gz_mode, compresslevel)
     59 elif hasattr(filename, "read") or hasattr(filename, "write"):
     60     binary_file = GzipFile(None, gz_mode, compresslevel, filename)

File ~/miniforge3/envs/habitat/lib/python3.9/gzip.py:173, in GzipFile.__init__(self, filename, mode, compresslevel, fileobj, mtime)
    171     mode += 'b'
    172 if fileobj is None:
--> 173     fileobj = self.myfileobj = builtins.open(filename, mode or 'rb')
    174 if filename is None:
    175     filename = getattr(fileobj, 'name', '')

FileNotFoundError: [Errno 2] No such file or directory: 'data/hab3_bench_assets/episode_datasets/small_large.json.gz'
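
In other words, the dataset loader formats config.data_path and opens it with gzip relative to the current working directory, and data/hab3_bench_assets/episode_datasets/small_large.json.gz does not exist there. A minimal sketch of just that file access, assuming the notebook is run from the habitat-lab repository root:

```python
# Minimal sketch of the file access that fails (run from the habitat-lab root,
# since the dataset path in the config is relative to the working directory).
import gzip
import os

episode_file = "data/hab3_bench_assets/episode_datasets/small_large.json.gz"
print(os.getcwd())                    # directory the relative path is resolved against
print(os.path.exists(episode_file))   # prints False here, hence the FileNotFoundError

if os.path.exists(episode_file):
    with gzip.open(episode_file, "rt") as f:
        episodes_json = f.read()      # same read performed by _load_from_file
```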

Steps to Reproduce

Steps to reproduce the behavior:

  1. Follow the installation instructions in the habitat-lab README.
  2. Follow examples/tutorials/humanoids_tutorial.ipynb until "env = init_rearrange_env(agent_dict, action_dict)" (a minimal sketch of the failing cell is included below).
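
For reference, the cell that fails boils down to the following. This is only a sketch: agent_dict and init_rearrange_env are helpers defined in earlier cells of the notebook, and the import path for HumanoidJointActionConfig is my assumption, not copied from the tutorial.

```python
# Sketch of the failing notebook cell (agent_dict and init_rearrange_env come
# from earlier tutorial cells; the import below is assumed, not from the notebook).
from habitat.config.default_structured_configs import HumanoidJointActionConfig

# Define the actions
action_dict = {
    "humanoid_joint_action": HumanoidJointActionConfig()
}
env = init_rearrange_env(agent_dict, action_dict)  # raises the FileNotFoundError above
```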

Please note that without a minimal working example to reproduce the bug, we may not be able to help you.

Expected behavior

The environment is initialized successfully and I can continue with the rest of the tutorial.

Additional context

xavierpuigf commented 4 months ago

You can find the dataset here: https://huggingface.co/datasets/ai-habitat/hab3_bench_assets/tree/main/episode_datasets
It can be downloaded using the command:

python -m habitat_sim.utils.datasets_download --uids hab3_bench_assets

Will update the tutorial to show that.
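
Note that the path in the error is relative, so the files need to end up under the habitat-lab repo's data/ directory; running the command from the repo root (and passing --data-path data/, as the README does for other dataset downloads, if needed) is one way. A quick check afterwards, as a sketch:

```python
# Sanity check after the download, run from the habitat-lab root
# (the episode path in the tutorial is resolved relative to the working directory).
import os

episode_dir = "data/hab3_bench_assets/episode_datasets"
print(os.listdir(episode_dir))  # should include 'small_large.json.gz'
```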

Skevinci commented 4 months ago

Thank you!