cram2 / cram

CRAM software stack
http://cram-system.org

Popcorndemo + Dual Arm Picking and Placing Noetic #259

Closed sunava closed 2 years ago

sunava commented 2 years ago

Integrates the popcorn demo shown here into CRAM. Most of the work was done by @tlipps, closing PR185. Thanks.

Trello Card: Popcorn-demo

This demo uses the pouring and slicing plans from PR https://github.com/cram2/cram/pull/232. Moreover, this PR addresses a problem with manipulating containers, which is further described in the Trello card.

Also integrates Dual Arm Picking and Placing; again, most of the work was done by @tlipps, closing PR185. Thanks. This feature:

- Allows robots to pick and place with two arms
- Won't move the arms when transporting an object with both arms
- Adds a Prolog predicate (subset list list -> bool)
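As a rough illustration of the subset check named in the last item, here is a minimal Common Lisp sketch built on the standard SUBSETP function. The function name and the arm keywords are hypothetical stand-ins, not the actual CRAM Prolog predicate:

```lisp
;; Illustrative sketch only. The real feature is a CRAM Prolog predicate
;; (subset list list -> bool); ARMS-ALREADY-HOLDING-P and the arm keywords
;; below are hypothetical stand-ins for how such a check could be used.
(defun arms-already-holding-p (requested-arms holding-arms)
  "Return T when every arm in REQUESTED-ARMS already holds the object,
so no additional arm motion is needed while transporting it."
  (subsetp requested-arms holding-arms :test #'eq))

;; Example: both arms are requested and both already hold the object.
;; (arms-already-holding-p '(:left :right) '(:right :left)) => T
```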

Also integrates world-state-detecting into perceiving; again, most of the work was done by @tlipps, closing PR158. Thanks.

How it works:

- Added a counter key to the perceiving designator
- Implemented a counter variable in fetch which counts the failed perceiving attempts
- Changed world-state-detecting so that the type of the object alone is now enough

How the implementation works

Integration of world-state-detecting with fetch and deliver

In fetch, a counter was implemented which counts how often a perception-object-not-found error is thrown. The Prolog statements that resolve the perceiving designator check whether the given object designator contains a name key; if it does, the occluding objects are determined. The perceive plan first tries to detect the object using detecting. If that throws a perception-object-not-found error, the error is caught, and when there are occluding objects and the counter has reached 0, the world-state-detecting designator is used instead.
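A hedged Common Lisp sketch of this retry-and-fall-back logic, assuming a simple countdown of remaining attempts. CPL:WITH-FAILURE-HANDLING and the PERCEPTION-OBJECT-NOT-FOUND failure are standard CRAM idioms, but DETECT-OBJECT, OCCLUDING-OBJECTS and WORLD-STATE-DETECT are hypothetical helpers standing in for the designator resolution this PR actually implements:

```lisp
;; Sketch only: DETECT-OBJECT, OCCLUDING-OBJECTS and WORLD-STATE-DETECT
;; are hypothetical helpers; the counter value of 3 is arbitrary.
(defun perceive-with-fallback (?object-designator)
  (let ((remaining-attempts 3))
    (cpl:with-failure-handling
        ((common-fail:perception-object-not-found (e)
           (declare (ignore e))
           (decf remaining-attempts)
           ;; Counter exhausted and the object is merely occluded:
           ;; fall back to detection from the belief state.
           (when (and (occluding-objects ?object-designator)
                      (<= remaining-attempts 0))
             (return (world-state-detect ?object-designator)))
           (when (> remaining-attempts 0)
             (cpl:retry))))
      ;; First choice: regular perception via DETECTING.
      (detect-object ?object-designator))))
```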

World-state-detecting with only a type

Using the btr:get-objects-for-type function, all objects of the given type are queried and the occluding objects are determined for each of them. If an object has occluders, it is in the field of view (FOV) of the robot but occluded by other objects. The result is a list of all objects which are occluded but still in the FOV of the robot. If this list is empty, a perception-object-not-found error is thrown.
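A similarly hedged sketch of the type-only lookup. BTR:GET-OBJECTS-FOR-TYPE is the function named above; OCCLUDING-OBJECTS-FOR is a hypothetical helper standing in for the occlusion check:

```lisp
;; Sketch only: OCCLUDING-OBJECTS-FOR is a hypothetical occlusion check.
(defun world-state-detect-by-type (?object-type)
  "Return the objects of ?OBJECT-TYPE that are in the robot's FOV but
occluded; signal PERCEPTION-OBJECT-NOT-FOUND when there are none."
  (let ((occluded-objects
          (remove-if-not #'occluding-objects-for
                         (btr:get-objects-for-type ?object-type))))
    (or occluded-objects
        (cpl:fail 'common-fail:perception-object-not-found))))
```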

sunava commented 2 years ago

@hawkina can you check the popcorn demo (check the iai_maps popcorn branch) and both milestone demos (setting and cleaning)? They should work out of the box :D

hawkina commented 2 years ago

Tested by running:

roslaunch cram_pr2_pick_place_demo sandbox.launch
(urdf-proj:with-simulated-robot (setting-demo))
(urdf-proj:with-simulated-robot (cleaning-demo))

and

roslaunch cram_pr2_popcorn_demo sandbox.launch
(urdf-proj:with-simulated-robot (demo))

It worked nicely :)