facebookresearch / habitat-lab

A modular high-level library to train embodied AI agents across a variety of tasks and environments.
https://aihabitat.org/
MIT License

Using .glb file in the pointnav #1702

Open alre5639 opened 7 months ago

alre5639 commented 7 months ago

I have gotten the pointnav tutorial working with the following script and downloaded the hm3d dataset. I would like to swap out the example scene 'data/datasets/pointnav/habitat-test-scenes/v1/train/train.json.gz' for one of the scenes from the hm3d dataset. However, these scenes are in .glb format. As stated in issue #1457, .glb files can be used directly in the pointnav type; however, when I change the data_path to my downloaded .glb I get an error: Not a gzipped file. How can I point the sim to the .glb file from the hm3d download?

Here is the script from the example and my two configuration files:

Example.py

import habitat
from habitat.sims.habitat_simulator.actions import HabitatSimActions
import cv2

FORWARD_KEY="w"
LEFT_KEY="a"
RIGHT_KEY="d"
FINISH="f"

def transform_rgb_bgr(image):
    return image[:, :, [2, 1, 0]]

def example():
    env = habitat.Env(
        config=habitat.get_config("benchmark/nav/pointnav/alec_test.yaml")
    )

    print("Environment creation successful")
    observations = env.reset()
    print("Destination, distance: {:3f}, theta(radians): {:.2f}".format(
        observations["pointgoal_with_gps_compass"][0],
        observations["pointgoal_with_gps_compass"][1]))
    # cv2.imshow("RGB", transform_rgb_bgr(observations["rgb"]))
    # when running remotely, just save the image instead of displaying it
    cv2.imwrite("RGB.png", transform_rgb_bgr(observations["rgb"]))
    print("Agent stepping around inside environment.")

    count_steps = 0
    while not env.episode_over:
        keystroke = cv2.waitKey(0)

        if keystroke == ord(FORWARD_KEY):
            action = HabitatSimActions.move_forward
            print("action: FORWARD")
        elif keystroke == ord(LEFT_KEY):
            action = HabitatSimActions.turn_left
            print("action: LEFT")
        elif keystroke == ord(RIGHT_KEY):
            action = HabitatSimActions.turn_right
            print("action: RIGHT")
        elif keystroke == ord(FINISH):
            action = HabitatSimActions.stop
            print("action: FINISH")
        else:
            print("INVALID KEY")
            continue

        observations = env.step(action)
        count_steps += 1

        print("Destination, distance: {:3f}, theta(radians): {:.2f}".format(
            observations["pointgoal_with_gps_compass"][0],
            observations["pointgoal_with_gps_compass"][1]))
        cv2.imshow("RGB", transform_rgb_bgr(observations["rgb"]))

    print("Episode finished after {} steps.".format(count_steps))

    if (
        action == HabitatSimActions.stop
        and observations["pointgoal_with_gps_compass"][0] < 0.2
    ):
        print("you successfully navigated to destination point")
    else:
        print("your navigation was unsuccessful")

if __name__ == "__main__":
    example()

alec_test.yaml

# @package _global_

defaults:
  - pointnav_base
  - /habitat/dataset/pointnav: alec_habitat_test
  - _self_

habitat:
  environment:
    max_episode_steps: 500
  simulator:
    agents:
      main_agent:
        sim_sensors:
          rgb_sensor:
            width: 256
            height: 256
          depth_sensor:
            width: 256
            height: 256

alec_habitat_test.yaml

# @package habitat.dataset

defaults:
  - /habitat/dataset: dataset_config_schema
  - _self_

type: PointNav-v1
split: train
data_path: data/datasets/pointnav/hm3d/v1/{split}/kfPV7w3FaU5.glb
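
For context on the error: the data_path of a PointNav-v1 dataset is expected to name a gzipped JSON file of navigation episodes, not a scene mesh, which is why pointing it at a .glb fails with "Not a gzipped file". A minimal sketch of what the loader actually consumes, assuming the stock habitat-test-scenes episodes are downloaded (paths are illustrative):

import gzip
import json

# The episode dataset is gzipped JSON, not a scene mesh.
data_path = "data/datasets/pointnav/habitat-test-scenes/v1/train/train.json.gz"
with gzip.open(data_path, "rt") as f:
    episodes = json.load(f)["episodes"]

# Each episode records the scene asset (.glb) it was generated from,
# which is how the simulator finds the mesh when the dataset is loaded.
print(episodes[0]["scene_id"])
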
aclegg3 commented 7 months ago

Hey @alre5639,

The dataset you are modifying here is actually a PointNavDataset, which contains PointNavEpisodes. These are produced by the generator: https://github.com/facebookresearch/habitat-lab/tree/main/habitat-lab/habitat/datasets/pointnav. The generator is fed a configuration file which defines the assets used for the episodes (i.e. the SceneDataset). When you load the episode dataset, the simulator will look for the assets which were used to generate those episodes.

TL;DR: You need to run the episode generator or get some HM3D PointNav episodes from elsewhere to load them in a lab environment.
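
A minimal sketch of running that generator for a single scene, assuming generate_pointnav_episode from habitat/datasets/pointnav/pointnav_generator.py, a navmesh available for the scene, and an env that can be bootstrapped (habitat-lab's own dataset tests seed the env with a dummy episode first); paths and the episode count are placeholders:

import gzip

import habitat
from habitat.datasets import make_dataset
from habitat.datasets.pointnav.pointnav_generator import generate_pointnav_episode

# Build an env whose simulator has the target scene loaded; the generator
# only needs env.sim to sample navigable start/goal pairs in that scene.
config = habitat.get_config("benchmark/nav/pointnav/alec_test.yaml")
env = habitat.Env(config=config)
episodes = list(generate_pointnav_episode(sim=env.sim, num_episodes=100))
env.close()

# Serialize the episodes in the same gzipped-JSON layout the stock datasets use.
dataset = make_dataset("PointNav-v1")
dataset.episodes = episodes
with gzip.open("data/datasets/pointnav/hm3d/v1/train/train.json.gz", "wt") as f:
    f.write(dataset.to_json())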

Check the DATASETS readme: https://github.com/facebookresearch/habitat-lab/blob/main/DATASETS.md for some pre-generated episodes if you don't want to generate your own.
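
With pre-generated episodes downloaded, the data_path in alec_habitat_test.yaml would point at the episode archive rather than at the mesh; the published PointNav splits typically follow the layout data/datasets/pointnav/hm3d/v1/{split}/{split}.json.gz.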

alre5639 commented 7 months ago

Thank you for the quick response. I am actually not particularly interested in the point-to-point navigation aspect of this problem; I am really interested in unguided exploration of the downloaded meshes. This example just seemed to be quite close to what I was looking for.

Is there another type, rather than PointNav-v1, that I could switch to that would allow exploration in the .glb files, or another example you could point me to for navigating within the .glb files? Thanks again.
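
For the free-exploration use case, habitat-sim itself can load a bare .glb with no episode dataset at all. A minimal sketch, assuming a recent habitat-sim and with the scene path as a placeholder:

import habitat_sim

# Point the backend directly at the mesh; no PointNav episodes involved.
sim_cfg = habitat_sim.SimulatorConfiguration()
sim_cfg.scene_id = "data/scene_datasets/hm3d/train/kfPV7w3FaU5.glb"  # placeholder path

# One RGB camera on a default agent.
rgb_spec = habitat_sim.CameraSensorSpec()
rgb_spec.uuid = "rgb"
rgb_spec.sensor_type = habitat_sim.SensorType.COLOR
rgb_spec.resolution = [256, 256]
agent_cfg = habitat_sim.agent.AgentConfiguration(sensor_specifications=[rgb_spec])

sim = habitat_sim.Simulator(habitat_sim.Configuration(sim_cfg, [agent_cfg]))

# The default agent action space already includes move_forward / turn_left /
# turn_right, so the agent can wander the mesh freely.
obs = sim.step("move_forward")
print(obs["rgb"].shape)
sim.close()

If a .navmesh ships alongside the scene (the habitat release of HM3D includes them), loading it via sim.pathfinder.load_nav_mesh(...) keeps the agent constrained to navigable floor space during exploration.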