roboticslab-uc3m / questions-and-answers

A place for general debate and question&answer
https://robots.uc3m.es/developer-manual/appendix/repository-index.html

Thoughts about path-planning within roboticslab-uc3m #40

Open jgvictores opened 6 years ago

jgvictores commented 6 years ago

Thoughts about path-planning within roboticslab-uc3m, and considerations on creating a path-planning repository within the organization.

Considerations to be taken into account before blindly doing this:

  1. Path-planning is not required for much of our research. 2D/3D path-planning is mostly required in the presence of obstacles, which is not the general use case in our current research (this may change over time). Cases where we are doing some path-planning:
    1. [2D navigation] TIAGo navigation, via ROS navigation stack.
    2. [2D navigation] There should be some remnant code in asibot-main ravebot or tasks, where OpenRAVE was used. It should be migrated to openrave-yarp-plugins if that hasn't been done yet.
    3. [3D grasping] There has been some progress with TEO in OpenRAVE in https://github.com/roboticslab-uc3m/teo-grasp for grasping a bottle. This may remain independent or some day be merged into openrave-yarp-plugins.
  2. We already have some trajectory generation in kinematics-dynamics, which is a good place for it (this is not exactly path-planning, since it does not take obstacles into account).
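
As an aside, the kind of 2D path-planning in the presence of obstacles mentioned in point 1 can be sketched with a minimal grid search. This is a purely illustrative toy (the `plan_2d` name and the obstacle map are made up for the example, not taken from any of our repositories):

```python
from collections import deque

def plan_2d(grid, start, goal):
    """Breadth-first search on an occupancy grid (1 = obstacle).

    Returns the list of cells from start to goal, or None if blocked.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # backtrack to recover the path
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None

# A 3x3 map with a wall in the middle column except the bottom row.
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = plan_2d(grid, (0, 0), (0, 2))
```

Without the obstacle column the path would be trivial; here the planner has to detour through the bottom row, which is exactly what distinguishes path-planning from plain trajectory generation.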

As seen, the above candidates already have their place. Therefore, my recommendations are the following:

  1. Do not create a new path-planning repository within this organization, at least for now.
  2. Use this issue to track path-planning developments within this organization.
  3. If results are in Cartesian space, keep close integration with kinematics-dynamics, which handles Cartesian-to-joint-space conversions. Therefore, new issues will potentially arise in kinematics-dynamics too.
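
To make recommendation 3 concrete, here is a toy sketch of what converting a Cartesian-space trajectory to joint space involves. The planar 2-link arm, its unit link lengths, and the analytic IK are illustrative assumptions; this is not the kinematics-dynamics API:

```python
import math

def ik_2link(x, y, l1=1.0, l2=1.0):
    """Analytic inverse kinematics for a planar 2-link arm (one elbow branch)."""
    cos_q2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_q2 <= 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(cos_q2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2), l1 + l2 * math.cos(q2))
    return q1, q2

def cartesian_line_to_joints(p0, p1, steps=5):
    """Sample a straight Cartesian segment and convert each waypoint to joint space."""
    waypoints = []
    for i in range(steps):
        t = i / (steps - 1)
        x = p0[0] + (p1[0] - p0[0]) * t
        y = p0[1] + (p1[1] - p0[1]) * t
        waypoints.append(ik_2link(x, y))
    return waypoints

traj = cartesian_line_to_joints((1.5, 0.0), (0.0, 1.5))
```

The point of the sketch: a planner that outputs Cartesian waypoints still needs an IK stage like this before the robot can execute anything, which is why Cartesian-space planning results should stay tightly coupled to kinematics-dynamics.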
jgvictores commented 6 years ago

cc: @PeterBowman @rsantos88

jgvictores commented 6 years ago

If results are in Cartesian space, keep close integration with kinematics-dynamics, which handles Cartesian-to-joint-space conversions. Therefore, new issues will potentially arise in kinematics-dynamics too.

Regarding trajectories in Cartesian space, new related issues for keeping close integration: https://github.com/roboticslab-uc3m/kinematics-dynamics/issues/134 and https://github.com/roboticslab-uc3m/kinematics-dynamics/issues/135

jgvictores commented 6 years ago

Regarding trajectories in joint space, just a reminder that we have some tools in the, well, tools repository. Namely, as commented here:

Specifically, you'll want the PlaybackThread. You can find an example of its use at examplePlaybackThread and its corresponding test.
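
As a rough illustration of what a joint-space playback utility does, here is a hypothetical Python sketch; it is not the actual C++ PlaybackThread API from the tools repository, just the idea of replaying recorded joint rows at a fixed period:

```python
import time

def play_back(trajectory, send_positions, period=0.05):
    """Replay recorded joint-space rows at a fixed period.

    trajectory: list of joint-position rows, e.g. [[q0, q1, ...], ...]
    send_positions: callback forwarding one row to the robot (or a mock).
    """
    start = time.monotonic()
    for i, row in enumerate(trajectory):
        delay = start + i * period - time.monotonic()
        if delay > 0:
            time.sleep(delay)  # hold the fixed sample period
        send_positions(row)

# Usage with a mock "robot" that just records what it receives.
sent = []
play_back([[0.0, 0.0], [0.1, 0.2], [0.2, 0.4]], sent.append, period=0.01)
```

Scheduling against the start time (rather than sleeping a fixed amount each iteration) avoids accumulating drift across long trajectories.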

jgvictores commented 6 years ago

Related: https://github.com/ros-industrial-consortium/tesseract (refs: https://rosindustrial.org/news/2018/7/5/optimization-motion-planning-with-tesseract-and-trajopt-for-industrial-applications and https://github.com/roboticslab-uc3m/kinematics-dynamics/issues/155#issuecomment-404444963)

PeterBowman commented 5 years ago

See also: https://github.com/personalrobotics/aikido.

AIKIDO is a C++ library, complete with Python bindings, for solving robotic motion planning and decision making problems. This library is tightly integrated with DART for kinematic/dynamics calculations and OMPL for motion planning. AIKIDO optionally integrates with ROS, through the suite of aikido_ros packages, for execution on real robots.

PeterBowman commented 5 years ago

Not exactly path-planning, but kinda related: https://github.com/robotology/navigation.

jgvictores commented 5 years ago

See also: https://github.com/strands-project/strands_navigation

jgvictores commented 5 years ago

TODO: See https://github.com/roboticslab-uc3m/kinematics-dynamics#similar-and-related-projects, specifically https://github.com/roboticslab-uc3m/kinematics-dynamics/blob/3859656e713d3bda9a10b13157a19b3a8f9a190e/README.md#path-planning-trajectory-generation-and-optimization

PeterBowman commented 2 years ago

I think the demo developed by @elisabeth-ms has some bits of path planning. It's a DL-based object detection app for grabbing stuff with one of TEO's arms.

jgvictores commented 2 years ago

I think the demo developed by @elisabeth-ms has some bits of path planning. It's a DL-based object detection app for grabbing stuff with one of TEO's arms.

Cool, nice catch! I'm totally seeing some OMPL at https://github.com/elisabeth-ms/teo-sharon/blob/cfc3a62270e130d0f3a8a8418c18b1a901508bea/programs/TrajectoryGeneration/TrajectoryGeneration.hpp#L17-L24 in addition to the KDL and FCL code. Thanks!
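
For context, the kind of sampling-based planning OMPL provides can be caricatured in a few lines: a toy 2D RRT in the unit square with one disc obstacle. Everything here (names, parameters, the obstacle) is made up for illustration and is unrelated to the code linked above:

```python
import math
import random

def rrt_2d(start, goal, collides, n_iters=5000, step=0.1, goal_tol=0.15, seed=0):
    """Toy RRT in the unit square: grow a tree toward random samples."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(n_iters):
        # 10% goal bias, otherwise a uniform sample in [0, 1) x [0, 1)
        sample = goal if rng.random() < 0.1 else (rng.random(), rng.random())
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        d = math.dist(nodes[i], sample)
        if d == 0.0:
            continue
        frac = min(1.0, step / d)  # extend at most one step toward the sample
        new = (nodes[i][0] + (sample[0] - nodes[i][0]) * frac,
               nodes[i][1] + (sample[1] - nodes[i][1]) * frac)
        if collides(new):
            continue
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < goal_tol:
            path, k = [], len(nodes) - 1
            while k is not None:  # backtrack through the tree
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

# Disc obstacle in the middle of the unit square.
blocked = lambda p: math.dist(p, (0.5, 0.5)) < 0.2
path = rrt_2d((0.1, 0.1), (0.9, 0.9), blocked)
```

A real setup (as in the linked TrajectoryGeneration code) would plan in joint space with FCL doing the collision checks and KDL the kinematics; the toy only checks the new node itself, not the segment leading to it.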

PeterBowman commented 2 years ago

See also this grasping demo featuring the iCub: https://github.com/robotology/community/discussions/573.

PeterBowman commented 2 years ago

Moar grasping straight from the ongoing Nvidia GTC AI conference (thanks @imontesino):