facebookresearch / habitat-sim

A flexible, high-performance 3D simulator for Embodied AI research.
https://aihabitat.org/
MIT License

Load Self-built Mesh in Habitat-Sim and get the semantic information #2404

Closed by hutslib 2 months ago

hutslib commented 4 months ago

How do I build my own meshes and load them in Habitat? I also want to give each asset a semantic label, and I would like to know how to arrange the data format.

aclegg3 commented 2 months ago

Hey @hutslib

This is a pretty broad topic so I'll provide some pointers and you can open a new issue if there are specific blockers you encounter.

First, I suggest you take a look at existing data sources for reference:

  1. habitat_test_scenes: these are individual meshes which are loaded as scenes. The tutorial on navigation basics demonstrates how to load the simulator with your custom scene asset and set up the agent/sensors (a minimal sketch follows this list).
  2. ReplicaCAD: https://aihabitat.org/datasets/replica_cad/ - this is a great reference for generating scenes composed of multiple objects and assigning each object a semantic id. Each object_config.json carries a semantic_id integer, which maps to a class in the semantic lexicon. This is the easiest way to add semantics, but it operates on a per-mesh level, so it won't do for scans or multi-part semantic meshes (a config-writing sketch appears after the documentation link below).
  3. HM3D Semantics: https://aihabitat.org/datasets/hm3d-semantics/ - This dataset has scene mesh semantics baked into texture files which can be read by the simulator. This requires painting the annotations in modeling software like Blender. You can quickly download the example scenes with the datasets_download.py utility, targeting the data group hm3d_example.

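For option 1, here is a minimal sketch of loading your own mesh as the scene and attaching color and semantic sensors. It follows the pattern from the navigation-basics tutorial; the scene path and resolution below are placeholders you should replace with your own.

```python
import habitat_sim

# Placeholder path to your own mesh; any scene asset Habitat can load (e.g. .glb) works.
backend_cfg = habitat_sim.SimulatorConfiguration()
backend_cfg.scene_id = "path/to/my_scene.glb"

# Color and semantic camera sensors attached to a single agent.
rgb_spec = habitat_sim.CameraSensorSpec()
rgb_spec.uuid = "color"
rgb_spec.sensor_type = habitat_sim.SensorType.COLOR
rgb_spec.resolution = [480, 640]

sem_spec = habitat_sim.CameraSensorSpec()
sem_spec.uuid = "semantic"
sem_spec.sensor_type = habitat_sim.SensorType.SEMANTIC
sem_spec.resolution = [480, 640]

agent_cfg = habitat_sim.agent.AgentConfiguration()
agent_cfg.sensor_specifications = [rgb_spec, sem_spec]

sim = habitat_sim.Simulator(habitat_sim.Configuration(backend_cfg, [agent_cfg]))
obs = sim.get_sensor_observations()
# Per-pixel integer semantic ids; these will be all zeros unless the asset
# or its dataset actually provides semantic annotations.
print(obs["semantic"].shape)
```
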
Finally, you can check out the documentation on our configuration format here: https://aihabitat.org/docs/habitat-sim/attributesJSON.html for details on how to set up individual configs.
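
As a concrete illustration of the per-object configs from option 2, here is a sketch that writes a minimal object_config.json. The asset names and the semantic_id value are placeholders, and the keys follow the attributes documentation linked above.

```python
import json

# Placeholder asset names and class id; pick ids that match your semantic lexicon.
object_config = {
    "render_asset": "my_chair.glb",        # visual mesh for this object
    "collision_asset": "my_chair_cv.glb",  # optional simplified collision mesh
    "semantic_id": 7,                      # integer class id in your lexicon
}

# Habitat discovers configs named <name>.object_config.json next to the asset.
with open("my_chair.object_config.json", "w") as f:
    json.dump(object_config, f, indent=4)
```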

I suggest you start from an existing dataset to understand how the pieces fit together.
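
Once you have a dataset downloaded, you can point the simulator at its scene dataset config and read the semantic annotations back out through the semantic scene graph. This is a sketch with placeholder paths; substitute whatever datasets_download.py actually placed under your data/ directory.

```python
import habitat_sim

# Placeholder paths: substitute the scene_dataset_config.json and the scene id
# from the dataset you downloaded.
backend_cfg = habitat_sim.SimulatorConfiguration()
backend_cfg.scene_dataset_config_file = "path/to/my_dataset.scene_dataset_config.json"
backend_cfg.scene_id = "path/to/scene_in_that_dataset"

agent_cfg = habitat_sim.agent.AgentConfiguration()
sim = habitat_sim.Simulator(habitat_sim.Configuration(backend_cfg, [agent_cfg]))

# The semantic scene graph exposes per-object semantic categories.
for obj in sim.semantic_scene.objects[:10]:
    if obj is not None and obj.category is not None:
        print(obj.id, obj.category.index(), obj.category.name())
```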