-
Dear developer,
right now I am working with RLBench as a framework to collect demonstrations for a DRL algorithm. Instead of going straight from waypoint to waypoint, I am using a "teleoperation too…
-
Thank you for this great work. The ARP model really helps me a lot.
I have a question about the experimental setting on the RLBench dataset. There is a sentence in Section 4.1 of this paper saying,…
-
Hi, I want to produce some RLBench data for my current research. When I run this file I only get some PNG images; why is there no action data? I think most imitation learning tasks require ground-truth actions, can y…
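A minimal sketch of one way to recover per-step actions, assuming the demos are loaded through RLBench's Python API: each `Observation` in a demo already stores proprioception such as `joint_positions`, `gripper_pose`, and `gripper_open`, so actions can be derived by pairing each observation with the next end-effector pose (the `demo_to_transitions` helper below is hypothetical, not part of RLBench):

```python
import numpy as np

def demo_to_transitions(demo):
    """Turn an RLBench demo (a sequence of Observation objects) into
    (observation, action) pairs, treating the next end-effector pose
    plus gripper state as the ground-truth action -- a common
    behaviour-cloning convention."""
    obs_list = list(demo)
    transitions = []
    for prev_obs, next_obs in zip(obs_list[:-1], obs_list[1:]):
        action = np.concatenate([
            next_obs.gripper_pose,           # x, y, z, qx, qy, qz, qw
            [float(next_obs.gripper_open)],  # 1.0 open, 0.0 closed
        ])
        transitions.append((prev_obs, action))
    return transitions
```

Joint-space actions can be built the same way from `next_obs.joint_positions` or `next_obs.joint_velocities` if that matches the action mode being trained.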
-
Hi,
I am using RLBench to test a manipulation policy network and have encountered a bug in PyRep:
When a zero action (position: 0, 0, 0; Euler angles: 0, 0, 0) is given with relative_to set …
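For reference, a sketch of a valid "stay in place" end-effector action, assuming one of RLBench's end-effector pose action modes: the rotation part is expected as a unit quaternion, so a zero rotation should be the identity quaternion rather than all zeros (an all-zero quaternion has zero norm and can make the underlying PyRep/IK call fail):

```python
import numpy as np

# A "do nothing" delta action for an end-effector pose action mode.
# The rotation must be a unit quaternion, so zero rotation is the
# identity quaternion (0, 0, 0, 1), not all zeros.
no_op_action = np.array([
    0.0, 0.0, 0.0,       # position delta in metres
    0.0, 0.0, 0.0, 1.0,  # identity quaternion (qx, qy, qz, qw)
    1.0,                 # gripper open
])
```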
-
Thank you for your work! I have successfully reproduced your paper. However, I would like to know how to train and test using only one camera in RLBench. Any guidance would be greatly appreciated!
…
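A minimal single-camera sketch, assuming a recent RLBench version: disable all cameras in the `ObservationConfig` and re-enable only the one you want before constructing the environment.

```python
from rlbench.observation_config import ObservationConfig

obs_config = ObservationConfig()
obs_config.set_all_low_dim(True)       # keep proprioceptive state
obs_config.set_all_high_dim(False)     # turn every camera off ...
obs_config.front_camera.set_all(True)  # ... then re-enable only the front camera
obs_config.front_camera.image_size = (128, 128)

# Pass this config when creating the environment, e.g.
# env = Environment(action_mode, obs_config=obs_config, headless=True)
```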
-
I'm using RLBench to generate a dataset and wanted to use the PourFromCuptoCup task for my work. When I run the code, it errors out and says this shouldn't happen.
I've included an image of the …
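A sketch of generating demos for this task through the Python API, under the assumption that the task class is exported as `PourFromCupToCup` in `rlbench.tasks` (check the exact name in that module); collecting one episode at a time makes it easier to see which episode triggers the error:

```python
from rlbench.action_modes.action_mode import MoveArmThenGripper
from rlbench.action_modes.arm_action_modes import JointVelocity
from rlbench.action_modes.gripper_action_modes import Discrete
from rlbench.environment import Environment
from rlbench.observation_config import ObservationConfig
from rlbench.tasks import PourFromCupToCup  # class name assumed from the issue

obs_config = ObservationConfig()
obs_config.set_all(True)

env = Environment(
    action_mode=MoveArmThenGripper(JointVelocity(), Discrete()),
    obs_config=obs_config,
    headless=True)
env.launch()

task = env.get_task(PourFromCupToCup)
for episode in range(10):
    try:
        demo = task.get_demos(1, live_demos=True)[0]
        print(f'episode {episode}: {len(demo)} steps')
    except Exception as exc:  # surfaces the failing episode instead of aborting the run
        print(f'episode {episode} failed: {exc}')

env.shutdown()
```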
-
Hi, I'm currently trying to run task_builder.py. My ROS and ROS 2 versions are Noetic and Foxy on Ubuntu 20.04, with CoppeliaSim v4.5.1_rev4. Running this in the terminal gives me this error:
(base) quyennt@quyennt:~/R…
-
Hello, I was trying to install the Docker image of CoppeliaSim from this repo. When I run "docker run coppeliasim-ubuntu18:latest", it works until this part:
![image](https://user-images.githubuserc…
-
I found that the depth data generated by gen_demonstration is quite different from other depth data. Do you think this is an intended result?
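If the comparison is against RLBench's own saved datasets, note that RLBench writes depth as a 24-bit-packed PNG normalised between the camera's near and far clipping planes, so it has to be rescaled before comparing with metric depth. A minimal decoding sketch, assuming the PNGs come from RLBench's default saver and that the near/far planes are available (for stored demos they are kept per camera in each observation's `misc` dict; the exact key names below are an assumption):

```python
from PIL import Image
from rlbench.backend.const import DEPTH_SCALE
from rlbench.backend.utils import image_to_float_array

def load_metric_depth(depth_png_path, near, far):
    """Decode an RLBench depth PNG (values normalised to [0, 1] and
    packed into 24 bits) into depth in metres."""
    normalised = image_to_float_array(Image.open(depth_png_path), DEPTH_SCALE)
    return near + (far - near) * normalised

# Example (key names assumed): near/far planes for the front camera of a stored demo
# near = obs.misc['front_camera_near']
# far = obs.misc['front_camera_far']
```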