allenai / procthor-10k

The ProcTHOR-10K Houses Dataset
https://procthor.allenai.org/
Apache License 2.0

Controller Error #5

Closed by pioneer-innovation 2 years ago

pioneer-innovation commented 2 years ago

Hi! I followed the demo code, but the controller returns an error message.

My code:

import prior
from ai2thor.controller import Controller

# Load the ProcTHOR-10K houses and pick one from the train split.
dataset = prior.load_dataset("procthor-10k")
train_dataset = dataset['train']
house = train_dataset[3]

# Pass the sampled house to the AI2-THOR controller as the scene.
controller = Controller(scene=house)

Error message:

Loading train: 100%|██████████| 10000/10000 [00:01<00:00, 9821.77it/s]
Loading val: 100%|██████████| 1000/1000 [00:00<00:00, 10208.92it/s]
Loading test: 100%|██████████| 1000/1000 [00:00<00:00, 10295.37it/s]
Traceback (most recent call last):
  File "/home/beark/zqf/experiment/procthor/main.py", line 31, in <module>
    controller = Controller(scene=house)
  File "/home/beark/anaconda3/envs/procthor/lib/python3.9/site-packages/ai2thor/controller.py", line 557, in __init__
    event = self.reset(scene)
  File "/home/beark/anaconda3/envs/procthor/lib/python3.9/site-packages/ai2thor/controller.py", line 627, in reset
    scene = Controller.normalize_scene(scene)
  File "/home/beark/anaconda3/envs/procthor/lib/python3.9/site-packages/ai2thor/controller.py", line 618, in normalize_scene
    if re.match(r"^FloorPlan[0-9]+$", scene):
  File "/home/beark/anaconda3/envs/procthor/lib/python3.9/re.py", line 191, in match
    return _compile(pattern, flags).match(string)
TypeError: expected string or bytes-like object
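
For context, the traceback ends in normalize_scene, which on the stable ai2thor release assumes scene is a scene-name string such as "FloorPlan1" (hence the re.match call), while a house loaded through prior is expected to be a plain dict of the house JSON, so the regex receives a non-string and fails. A minimal check illustrating this, assuming the same dataset access pattern as in the snippet above:

import prior

# Load the same house as in the report and inspect its type.
dataset = prior.load_dataset("procthor-10k")
house = dataset["train"][3]

# The stable release's Controller.normalize_scene() calls re.match() on `scene`,
# which only handles scene-name strings like "FloorPlan1"; a ProcTHOR house is
# expected to be a dict here, which is what triggers the TypeError above.
print(type(house))
print(isinstance(house, str))  # expected False, so re.match() cannot accept it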

mattdeitke commented 2 years ago

Hi @pioneer-innovation, that syntax is currently only supported on a nightly build of AI2-THOR. Please install it with:

pip install --extra-index-url https://ai2thor-pypi.allenai.org ai2thor==0+391b3fae4d4cc026f1522e5acf60953560235971

This is what is done in the ProcTHOR Colab notebook.

Let me know if that works :)
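
For reference, a minimal sketch of the end-to-end flow once that nightly build is installed, following the same pattern as the code in the report; the RotateRight sanity check at the end is an illustrative addition rather than part of the original demo:

# Shell step first (the nightly build that accepts a house dict as the scene):
#   pip install --extra-index-url https://ai2thor-pypi.allenai.org ai2thor==0+391b3fae4d4cc026f1522e5acf60953560235971

import prior
from ai2thor.controller import Controller

# Load ProcTHOR-10K and pick one procedurally generated house (a dict).
dataset = prior.load_dataset("procthor-10k")
house = dataset["train"][3]

# With the nightly build, the house dict can be passed directly as the scene.
controller = Controller(scene=house)

# Illustrative sanity check using a standard AI2-THOR action.
event = controller.step(action="RotateRight")
print(event.metadata["lastActionSuccess"])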

abstinentcode commented 2 years ago

Thanks! It works!

mattdeitke commented 2 years ago

Ooh, awesome! Glad to hear it worked :)

Closing the issue for now, but feel free to re-open or create another issue if you run into anything else :)