BerkeleyAutomation / fog_x

Apache License 2.0

TypeError: string indices must be integers while running example script #11

Closed J-HowHuang closed 7 months ago

J-HowHuang commented 7 months ago

**Describe the bug**
`TypeError: string indices must be integers` while running the example script.

**To Reproduce**
Steps to reproduce the behavior:

  1. Install from source: `poetry add ../fog_x -E rtx`
  2. Run the example script:

```python
import fog_x

# 🦊 Dataset Creation
# from distributed dataset storage
dataset = fog_x.Dataset(
    name="demo_ds",
    path="~/test_dataset",  # can be AWS S3, Google Bucket!
)

# 🦊 Data collection:
# create a new trajectory
episode = dataset.new_episode()

# collect step data for the episode
episode.add(feature="arm_view", value="image1.jpg")

# Automatically time-aligns and saves the trajectory
episode.close()

# 🦊 Data Loading:
# load from existing RT-X/Open-X datasets
dataset.load_rtx_episodes(
    name="berkeley_autolab_ur5",
    additional_metadata={"collector": "User 2"},
)

# 🦊 Data Management and Analytics:
# Compute- and memory-efficient filter, map, aggregate, groupby
episode_info = dataset.get_episode_info()
desired_episodes = episode_info.filter(episode_info["collector"] == "User 2")
```



**Screenshots**

```
python examples/example_fogx_collector.py
I 2024-04-29 22:36:24,783 dataset.py:104] Dataset path: /home/ubuntu/test_dataset
I 2024-04-29 22:36:24,784 polars_connector.py:144] Prepare to load table demo_ds loaded from /home/ubuntu/test_dataset/demo_ds.parquet.
I 2024-04-29 22:36:24,795 polars_connector.py:152] Table demo_ds loaded from /home/ubuntu/test_dataset/demo_ds.parquet.
I 2024-04-29 22:36:24,813 db_manager.py:293] Closing the episode with metadata {}
2024-04-29 22:36:25.753912: I external/local_tsl/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used.
2024-04-29 22:36:25.758487: I external/local_tsl/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used.
2024-04-29 22:36:25.826837: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 AVX512F FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2024-04-29 22:36:27.131030: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
2024-04-29 22:36:27.799667: W external/local_tsl/tsl/platform/cloud/google_auth_provider.cc:184] All attempts to get a Google authentication bearer token failed, returning an empty token. Retrieving token from files failed with "NOT_FOUND: Could not locate the credentials file.". Retrieving token from GCE failed with "FAILED_PRECONDITION: Error executing an HTTP request: libcurl code 6 meaning 'Couldn't resolve host name', error details: Could not resolve host: metadata.google.internal".
I 2024-04-29 22:36:28,023 dataset_info.py:599] Load dataset info from gs://gresearch/robotics/berkeley_autolab_ur5/0.1.0
I 2024-04-29 22:36:28,575 logging_logger.py:49] Constructing tf.data.Dataset berkeley_autolab_ur5 for split None, from gs://gresearch/robotics/berkeley_autolab_ur5/0.1.0
I 2024-04-29 22:36:28,576 dataset.py:381] train
Traceback (most recent call last):
  File "/home/ubuntu/FogX-Store/examples/example_fogx_collector.py", line 20, in <module>
    dataset.load_rtx_episodes(
  File "/home/ubuntu/.cache/pypoetry/virtualenvs/fogxstore-nGESZMNh-py3.10/lib/python3.10/site-packages/fog_x/dataset.py", line 385, in load_rtx_episodes
    for step in tf_episode["steps"]:
TypeError: string indices must be integers
```
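The log lines `for split None` and `dataset.py:381] train` hint at the failure mode: when TFDS is asked for `split=None`, it returns a dict mapping split names to datasets, so iterating it yields the string `"train"` rather than episode records, and indexing that string with `"steps"` raises exactly this `TypeError`. A minimal stand-in using a plain dict (not the actual fog_x or TFDS code; all names here are illustrative):

```python
# Stand-in for what TFDS returns when no split is requested:
# a dict of split name -> dataset, not an iterable of episodes.
dataset_by_split = {"train": [{"steps": ["step0", "step1"]}]}

def iterate_without_split(ds):
    """Mimics the failing loop: iterating the dict yields split names."""
    for tf_episode in ds:     # tf_episode == "train" (a str)
        tf_episode["steps"]   # indexing a str with a str -> TypeError

try:
    iterate_without_split(dataset_by_split)
except TypeError as exc:
    print(exc)  # "string indices must be integers" on Python 3.10
```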



**Desktop (please complete the following information):**
 - OS: Ubuntu 22.04 LTS
 - Poetry: 1.8.2
 - Python: 3.10.12

KeplerC commented 7 months ago

Commit 1c6e2f8 fixes this issue. It was caused by not specifying the split in tfds.

I do see #13 when running the example from the repo; that needs a closer look, but your issue is resolved.
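For anyone hitting this on an older checkout, the fix described above amounts to selecting a concrete split before iterating episodes. A sketch of the corrected loop, again using a plain dict as a stand-in for the TFDS result (illustrative names, not the actual fog_x internals):

```python
# Stand-in for the TFDS per-split result: split name -> list of episodes.
dataset_by_split = {"train": [{"steps": ["step0", "step1"]}]}

def iterate_with_split(ds, split="train"):
    """Select the split first, so iteration yields episode dicts."""
    collected = []
    for tf_episode in ds[split]:          # an episode dict, not a str
        for step in tf_episode["steps"]:  # now indexing works
            collected.append(step)
    return collected

print(iterate_with_split(dataset_by_split))  # ['step0', 'step1']
```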