facebookresearch / habitat-sim

A flexible, high-performance 3D simulator for Embodied AI research.
https://aihabitat.org/
MIT License

Adding sensors to HITL app #2384

Open ambervg opened 2 months ago

ambervg commented 2 months ago

I am working on a simple HITL app, and for it I am trying to add some more sensors via a config.yaml file. The final goal is to export videos from these sensors. I have been looking at default_structured_configs.py from the habitat-lab repo to see how to structure that config.yaml file.

Below I will post a snippet. I also made a dummy project repo, which I will link here.

habitat:
  simulator:
    agents_order: ['agent_0', 'agent_1']
    agents:
      agent_0:
        sim_sensors:
          head_rgb_sensor:
            type: 'HabitatSimRGBSensor'
            height: 256
            width: 256
            position: [0.0, 1.25, 0.0]
            orientation: [0.0, 0.0, 0.0]
            uuid: 'head_rgb'
      agent_1:
        sim_sensors:
          head_rgb_sensor:
            type: 'HabitatSimRGBSensor'
            height: 224
            width: 224
            position: [0.0, 1.25, 0.0]
            orientation: [0.0, 0.0, 0.0]
  task:
    lab_sensors:
      compass_sensor:
        type: "CompassSensor"

However, when I try to access the sensor information through sim.get_sensor_observations().keys(), I don't see my newly created sensors. Is there something else I need to do to retrieve observations from these sensors? Looking at this tutorial, it seems like adding them to the config should be enough.
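
For reference, this is roughly what I am running to inspect the available sensors (a minimal sketch; how the sim instance is obtained inside the HITL app is omitted here):

# `sim` is the habitat_sim.Simulator instance used by the HITL app
# (how it is obtained is omitted in this sketch).
observations = sim.get_sensor_observations()
# I expected the configured uuids (e.g. 'head_rgb') to appear among
# these keys, but only the default sensors show up.
print(observations.keys())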

Please let me know if anything I am asking is unclear, and I will elaborate. If anyone could help me out with this question, that would be greatly appreciated! I did check out the documentation and tutorials before posting this issue, so I hope I have not overlooked an obvious answer there.

ambervg commented 1 month ago

System information & package versions

OS: Ubuntu 22.04.1
Architecture: x86_64
habitat-sim version: 0.3.0
habitat-lab version: 0.3.0
habitat-hitl version: 0.3.0

In case this is relevant :smiley:

0mdc commented 1 month ago

Hey @ambervg,

Thanks for the detailed info! Sensors are indeed tricky to use in the current state of the stack. Something may be overwriting or incorrectly propagating your configuration. I'll dive into the code and let you know what I find.

ambervg commented 1 month ago

@0mdc Thanks! Would love to hear any thoughts when you have them. Also tagging @rutadesai.

ambervg commented 1 month ago

It might be valuable for me to expand a little on what exactly I want to do. The goal is to export observations from different angles in the simulation. These angles can be static in the environment or moving with the agent (e.g. a zoomed-out first-person perspective).

My assumption was that sensors would be the easiest way to do that, but I am also open to alternative approaches.
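
For context, this is roughly the export loop I have in mind (a sketch; images_to_video comes from habitat-lab's visualization utils, and the 'head_rgb' uuid and step count are assumptions based on my config above):

from habitat.utils.visualizations.utils import images_to_video

frames = []
for _ in range(300):  # illustrative number of steps
    obs = sim.get_sensor_observations()
    # habitat-sim RGB sensors return RGBA; drop the alpha channel.
    # In a multi-agent setup the key may be prefixed, e.g. 'agent_0_head_rgb'.
    frames.append(obs["head_rgb"][..., :3])
    sim.step_physics(1.0 / 30.0)

images_to_video(frames, output_dir="videos", video_name="head_rgb", fps=30)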

@0mdc @rutadesai

0mdc commented 1 month ago

Hey @ambervg

Apologies for the late reply. This stack is also somewhat obscure to me. You probably need to add more information to your lab config.

These observation keys probably have to be defined explicitly. Note that the agent_N_ prefix is expected to come from agents_order.

habitat:
  gym:
    obs_keys:
      - agent_0_head_rgb_sensor
      - agent_1_head_rgb_sensor
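
A quick way to verify that the keys are picked up (a sketch; make_gym_from_config is from habitat-lab, and the config path is illustrative):

import habitat
from habitat.gym import make_gym_from_config

config = habitat.get_config("path/to/config.yaml")  # illustrative path
env = make_gym_from_config(config)
# Only the sensors listed under gym.obs_keys are exposed in the
# flattened gym observation space.
print(env.observation_space)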

There are some utilities that may be helpful if you want quick observations, such as the peek function in this class: https://github.com/facebookresearch/habitat-lab/blob/main/habitat-lab/habitat/sims/habitat_simulator/debug_visualizer.py#L142
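
For example (a sketch; the constructor and peek arguments here are from memory and may differ between versions):

from habitat.sims.habitat_simulator.debug_visualizer import DebugVisualizer

dbv = DebugVisualizer(sim)  # sim: the habitat_sim.Simulator instance
# peek() can target an object handle, an agent, or (I believe) the
# special "stage" subject for the whole scene; it returns a debug
# observation that can be saved to disk.
obs = dbv.peek("stage")
obs.save("debug_output/")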

0mdc commented 1 month ago

@zephirefaith Since you had to go through these hoops recently, would you know if anything else needs to be defined for this to function as expected in the current stack?