robotflow-initiative / rfuniverse

Robot physics simulation and RL platform based on Unity
https://robotflow.ai/
Apache License 2.0

How to achieve teleoperation #10

Open nutsintheshell opened 7 months ago

nutsintheshell commented 7 months ago

The paper shows that people can reconstruct their hand motion in the simulated environment by wearing data gloves. However, I can't find how to achieve this in the docs. Could someone tell me how to do it? Thanks.

ghzh26252 commented 7 months ago

Teleoperation can be implemented on top of the SteamVR Plugin with data gloves or any other VR hardware's SDK. It is decoupled from RFUniverse while remaining fully compatible with it, which is why it is not included in the open-source project.

Here is a simple workflow for implementing teleoperation with an HTC VIVE device once you are familiar with the secondary-development workflow of RFUniverse:

  1. Create a Unity project and import the RFUniverse SDK and the SteamVR Plugin according to their documentation.
  2. Create an empty scene following the RFUniverse documentation, disable the camera in the scene, and add the CameraRig prefab from the SteamVR Plugin.
  3. Referring to the Interactions_Example scene that ships with the SteamVR Plugin, add an object to the scene that can be interacted with via the VR controllers, attach the RFUniverse BaseAttr component to it, and don't forget to set its ID.
  4. Add a URDF robot to the scene according to the RFUniverse documentation, manually enable NativeIK in the Inspector panel, and don't forget to set its ID.
  5. Configure the pyrfuniverse environment according to the RFUniverse documentation, create your Python script, read the pose of the interactive object through the pyrfuniverse API, and set it as the robot's IK target (see the sketch after this list).
  6. At this point the simulated robot is already being driven. If you also need to drive a real robot, read the robot's state from the simulation through the pyrfuniverse API and send it to the real robot.
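
As a starting point for steps 5 and 6, here is a minimal Python sketch of the teleoperation loop. The attribute IDs (123 for the interactive object, 456 for the robot) are placeholders that must match the IDs you set in Unity, `send_to_real_robot` is a hypothetical function standing in for your own hardware interface, and the method names and data keys follow the pyrfuniverse examples (`GetAttr`, `IKTargetDoMove`, `IKTargetDoRotate`, `data['position']`), so double-check them against the documentation of the pyrfuniverse version you use.

```python
from pyrfuniverse.envs.base_env import RFUniverseBaseEnv

# Connect to the running Unity scene built in steps 1-4.
env = RFUniverseBaseEnv()
env.step()  # one step so that attribute data is populated

target = env.GetAttr(123)  # interactive object moved by the VR controller (placeholder ID)
robot = env.GetAttr(456)   # URDF robot with NativeIK enabled (placeholder ID)

while True:
    env.step()  # advance the simulation and refresh attribute data

    # Read the current pose of the VR-driven object.
    position = target.data['position']
    rotation = target.data['rotation']

    # Drive the robot's IK target to that pose.
    robot.IKTargetDoMove(position=position, duration=0.05, speed_based=False)
    robot.IKTargetDoRotate(rotation=rotation, duration=0.05, speed_based=False)

    # Step 6 (optional): read the simulated joint positions and forward them
    # to the real robot through your own hardware interface.
    joint_positions = robot.data['joint_positions']
    # send_to_real_robot(joint_positions)  # hypothetical, hardware-specific
```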

This is just a simple workflow; the specific implementation needs to be adjusted for your VR hardware. If you have any questions during the actual development, feel free to keep asking here.