h2r / ros_reality

Connect a ROS-enabled robot to Unity

How to use this package #5

Closed git-hatano closed 5 years ago

git-hatano commented 5 years ago

I think your work is amazing!! I want to use this package. I was able to run ros_reality_bridge.launch without problems, but I do not understand what to do next. Could you explain how to use the package?

ericrosenbrown commented 5 years ago

Thank you for your interest in using ROS Reality!

To use this package, you'll need to first follow the installation instructions in the README.md. Have you completed all of the installation steps (i.e., have you also installed Unity, downloaded the Unity repo, and opened the project)?

Also, do you have a Baxter robot, or are you using a different robot?

git-hatano commented 5 years ago

Thank you so much for your reply.

Yes, I have a Baxter robot! I have already completed all of the installation instructions. After installing, how do I start a project? And what setup is required?

ericrosenbrown commented 5 years ago

Once you've downloaded both the ROS package and the Unity package, you will:

1) Run ros_reality_bridge.launch on your ROS computer, which it sounds like you've already done. You can verify this works by echoing the /ros_unity topic and making sure the data is streaming properly (see the snippet after this list). If you don't have a Kinect, you will want to set the argument launch_kinect to false.
2) On the VR computer, make sure SteamVR is installed and that your Vive is fully set up and calibrated.
3) Open the Unity project on your VR computer and launch any of the scenes (I'd suggest starting with the position control scene).
4) After opening the Unity scene, click on the WebsocketClient game object and change the websocket IP address to match the IP address/port of your ROS computer.
5) After that, you can click play on the Unity scene, and you should at least see the Baxter URDF model be built with the same transforms as your real Baxter. You should also see on the ROS computer running the launch file that a client has connected; this is good. If you have a Kinect set up, you should also see a point cloud streaming into the scene. If your Kinect is calibrated, the point cloud will be overlaid from the perspective of the Kinect; otherwise you'll need to manually move the Kinect game object to put the point cloud data where you want it.
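For step 1, the commands look roughly like this (a sketch; I'm assuming the launch file lives in the ros_reality package and that launch_kinect is a regular roslaunch argument, so adjust the names to match your checkout):

```
# On the ROS computer; disable the Kinect stream if you don't have one attached
roslaunch ros_reality ros_reality_bridge.launch launch_kinect:=false

# In a second terminal, confirm data is streaming on the topic Unity reads from
rostopic echo /ros_unity
```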

At this point, you're all done. You should be able to control the robot with the Vive hand controllers by holding in the gripper buttons and moving the controllers around, and whenever the real robot moves, the virtual one should update as well.
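If the launch-file terminal never reports a client connecting in step 5, you can sanity-check the websocket endpoint from any machine with a short script. This is a minimal sketch, assuming the bridge exposes a rosbridge-style websocket server on its default port 9090 (the IP below is a placeholder) and that you have the websocket-client pip package installed:

```python
# Sanity-check the websocket endpoint the Unity WebsocketClient will use.
# Assumption: the bridge speaks the rosbridge protocol on port 9090;
# adjust ROS_IP/PORT to match your setup.
import json
import websocket  # pip install websocket-client

ROS_IP = "192.168.1.10"  # placeholder: your ROS computer's IP
PORT = 9090              # rosbridge default; change if your launch file differs

ws = websocket.create_connection("ws://%s:%d" % (ROS_IP, PORT), timeout=5)

# Ask the server to stream the topic the bridge publishes for Unity,
# then print the start of the first message we receive back.
ws.send(json.dumps({"op": "subscribe", "topic": "/ros_unity"}))
print(ws.recv()[:200])
ws.close()
```

If this connects and prints data, the ROS side is fine and the problem is on the Unity/VR machine (wrong IP in the WebsocketClient game object, or a firewall).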

If you have any questions about this process, please feel free to point out where you're specifically having issues and we'd be happy to help :)

git-hatano commented 5 years ago

Thank you very much for your kind response. I will try implementing it following your instructions!