robotology-legacy / wysiwyd

What You Say Is What You Did
http://wysiwyd.upf.edu

Integrating IIT peripersonal space in the demo #139

Closed clement-moulin-frier closed 7 years ago

clement-moulin-frier commented 8 years ago

That could be a good starting point to include a value system, e.g. differentiating between good objects (e.g. a banana) associated with a smaller PPS and bad objects (e.g. a spider) with a bigger PPS. The value would be provided by a human (good/bad).

Tobias-Fischer commented 7 years ago

Hey @clement-moulin-frier @matejhof @towardthesea @jypuigbo, Do you think this will happen, or shall we close this issue? Best, Tobi

towardthesea commented 7 years ago

Hi @Tobias-Fischer, we are working on it, so please keep it open :smile:

Cheers, Phuong

matejhof commented 7 years ago

I have just updated a Google document summarizing the status: Peripersonal space with drives and object values

There are 3 possibilities:

  1. Margin-of-safety expansion/shrinking as a whole, responding to the overall state of the agent (e.g. more cautious or more stressed -> expanding the whole PPS as a safety margin…)
  2. Modulating PPS response w.r.t. object valence/identity
  3. Modulating internal drives as a function of PPS activations

Nr. 1 and 2 are implemented and tested (the result of work with @jypuigbo). Nr. 3 has been suggested recently by @clement-moulin-frier. There, all you need to do is connect to /visuoTactileRF/pps_events_aggreg:o and, based on the magnitudes of the PPS activations, modulate your drives.
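For Nr. 3, a minimal reader could look like the sketch below (assuming the aggregated port streams a Bottle of activation magnitudes; please check the actual message format of visuoTactileRF, and note that the port name on our side and the drive-update rule at the end are purely hypothetical):

// Minimal sketch of possibility 3: modulating a drive from PPS activations.
#include <yarp/os/Network.h>
#include <yarp/os/BufferedPort.h>
#include <yarp/os/Bottle.h>
#include <algorithm>

int main()
{
    yarp::os::Network yarp;

    yarp::os::BufferedPort<yarp::os::Bottle> ppsPort;
    ppsPort.open("/myDriveModule/pps_events:i");   // hypothetical port name
    yarp::os::Network::connect("/visuoTactileRF/pps_events_aggreg:o",
                               ppsPort.getName());

    while (true)
    {
        yarp::os::Bottle *events = ppsPort.read();   // blocking read
        if (events == nullptr) break;

        // Take the strongest activation as a crude proximity/threat signal.
        double maxAct = 0.0;
        for (int i = 0; i < events->size(); i++)
            maxAct = std::max(maxAct, events->get(i).asDouble());

        // Hypothetical drive update: the stronger the PPS activation,
        // the faster some comfort/safety drive decays.
        // driveLevel -= k * maxAct;
    }
    return 0;
}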

Let me know which of the 3 possibilities you consider best suited for the demo. I'll do my best to support you remotely. @towardthesea should be able to assist on the spot at the integration meeting. Good luck!

pattacini commented 7 years ago

Thanks @matejhof We'll discuss this tomorrow and let you know.

jypuigbo commented 7 years ago

Hi @matejhof! We've been discussing a bit what could be interesting to do, and we suggest a mix of 2 and 3:

For the 1st we might have to think a bit more, as the current setup would not benefit from the feature at all. The pure 3rd can be done, but it might just look weird unless the affected drives fit this behavior.

I'll make the necessary changes to send the hand data to the PPS and to change the value associated with the hand.

What do you think? Does it make sense?

towardthesea commented 7 years ago

Just providing more information on a feasible scenario:

how the robot reacts to moving hands, depending on the modulation that @jypuigbo mentioned above.

Cheers, Phuong

matejhof commented 7 years ago

Makes sense, I think! Using the Kinect to track human hands will be a nice addition.

As you two probably already know, but just to clarify for everyone: the extent of the PPS is fixed (we're newly using 45 cm with @towardthesea; not sure which representation you're currently using, the old learned files in the repo had only 20 cm and were from skin V1). What is modulated are the activations. These are in turn propagated (in aggregated form) to the react-controller (or previously demoAvoidance) and may trigger the avoidance. Thus, behaviorally it may look as though the PPS expanded (or shrunk), because objects that previously did not trigger any response now do.
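To illustrate (a sketch only, assuming a simple multiplicative modulation by object valence; the actual scheme lives in the peripersonal-space repo and may differ, and the threshold value here is hypothetical):

// Illustration of how modulated activations can mimic PPS expansion.
double modulateActivation(double rawActivation,  // in [0, 1], extent fixed
                          double valence)        // e.g. -0.5 threat ... +0.5 safe
{
    // Threatening objects (negative valence) boost the activation.
    return rawActivation * (1.0 - valence);
}

bool triggersAvoidance(double rawActivation, double valence,
                       double threshold = 0.4)   // hypothetical threshold
{
    // With the same raw activation, a threat can cross the threshold
    // while a neutral object does not: behaviorally, the PPS "expands".
    return modulateActivation(rawActivation, valence) > threshold;
}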

Please also make sure you use the right branches: dev for react-control and pps-with-modulations-dev-V2skinAndkinematics for PPS.

Finally, it would be nice if you could produce some graphs (and a video :) ) demonstrating the behavior that we could use for Deliverable 6.3 and eventually for a paper. Please have a look at the Logging section in Reaching in clutter. If you dump the react-control and Kinect data, we should be able to get something out of it eventually.

Let me know how it goes!

Tobias-Fischer commented 7 years ago

Hey @matejhof @towardthesea @jypuigbo, just checking in on how this is going. Is it still planned to show this in the demo?

Best, Tobi

pattacini commented 7 years ago

This has become a kind of very-last-moment integration, because of the difficulty and complexity of replacing the current motor pipeline (karma) with the new reactive layer that can talk to PPS. It was all expected though 😉

@towardthesea will be on it during this month, let's see how it goes. Probably, we'll be able to show results from a first implementation.

towardthesea commented 7 years ago

Some very first results of integration and testing give us some hope :smile:

towardthesea commented 7 years ago

Hi @jypuigbo, I wonder if your module, which modulates the stress value of an object (i.e. negative, positive, or neutral), provides any RPC commands for the user to manipulate it. I would like to change the stress level manually to see the effect on the PPS and, consequently, on the robot's behavior. If it does, please give me as much detail as possible.

Thanks 😄

Tobias-Fischer commented 7 years ago

Hi @towardthesea, The code is in wysiwyd/src/modules/reactivelayer/sensationManager/opcSensation.cpp, lines 186-200. You can change the value where it says right_hand.addDouble(-0.5); //Currently hardcoded threat. Make adaptive. Maybe @jypuigbo can add an RPC command to make it adaptive :)
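A minimal sketch of such an RPC hook could look like the one below (the command name "set threat" and the module wiring are assumptions; only the addDouble call is taken from the actual code):

// Replace the hardcoded -0.5 with a member an RPC command can change at runtime.
#include <yarp/os/RFModule.h>
#include <yarp/os/ResourceFinder.h>
#include <yarp/os/RpcServer.h>
#include <yarp/os/Bottle.h>

class OpcSensationModule : public yarp::os::RFModule
{
    yarp::os::RpcServer rpcPort;
    double handThreat = -0.5;   // was hardcoded in opcSensation.cpp

public:
    bool configure(yarp::os::ResourceFinder &rf) override
    {
        rpcPort.open("/opcSensation/rpc:i");   // hypothetical port name
        attach(rpcPort);                       // route rpc commands to respond()
        return true;
    }

    bool respond(const yarp::os::Bottle &cmd, yarp::os::Bottle &reply) override
    {
        // e.g. from a terminal: yarp rpc /opcSensation/rpc:i
        //      >> set threat -0.8
        if (cmd.get(0).asString() == "set" && cmd.get(1).asString() == "threat")
        {
            handThreat = cmd.get(2).asDouble();
            reply.addString("ack");
        }
        else
            reply.addString("nack");
        return true;
    }

    bool updateModule() override
    {
        // ... right_hand.addDouble(handThreat);   // instead of the literal -0.5
        return true;
    }

    double getPeriod() override { return 0.1; }
};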

towardthesea commented 7 years ago

Thanks @Tobias-Fischer, I saw those lines, but an RPC command is much easier to use :smile: Hope @jypuigbo can provide a better solution :smile:

jypuigbo commented 7 years ago

Hi @towardthesea ,

What @Tobias-Fischer was pointing to is how to change the value. The value of an object is currently hardcoded, so you can see changes in the PPS depending on the position of the object, but not by changing the value itself.

In contrast, the stress level depends on the allostatic controller drives. You can run it with any drive; I suggest running the reactive layer (SensationsManager, BehaviorManager, homeostasis and allostaticController) with --from pps.ini. The script should be ready in wysiwyd/main/app/demos/scripts/PPS_with_valence_and_stress_Test.xml.

If you want to show the changes according to stress, but have them more controlled than with the allostatic controller running, I can guide you via Slack/Skype to change the drive values manually to increase/decrease stress.

Let me know!

towardthesea commented 7 years ago

Hi @Tobias-Fischer, for the hand detection from the Kinect, is it consistent in occluded cases, e.g. when the human's hand is covered by the iCub's hand?

Tobias-Fischer commented 7 years ago

Hi, what do you mean by "consistent"? The Kinect will do some kind of estimation, but I have no idea how accurate the estimates are in case of occlusions :(. It's best to try it for the cases you're interested in. Keep in mind that a random forest is used (at least as far as I know), so there is no guarantee of the same output given the same input. Let me know if that answers your question.

Best, Tobi

towardthesea commented 7 years ago

By "consistent" I meant whether the human hand can disappear (in the OPC) when occluded by the robot hand. However, your point about the random forest somehow answers my question.

Thanks!

matejhof commented 7 years ago

Without knowing the details, it seems likely that under occlusion the hand will go undetected eventually. However, since the react-controller is using both visual (from PPS) and tactile (directly from the skin) information, we could think of demonstrating exactly this. The hand would get occluded and lost from Kinect - OPC - PPS, then it will touch the skin, and here the next layer will kick in and trigger avoidance. Something like that?

matejhof commented 7 years ago

btw., guys, thanks to @towardthesea, the documentation of PPS with drives has started here: https://github.com/robotology/wysiwyd/wiki/Peripersonal-space-and-modulation-by-drives-and-object-valence Please use it and expand it! @jypuigbo in particular I guess.

Tobias-Fischer commented 7 years ago

Good progress, guys! Looking forward to seeing this in a couple of weeks. Regarding the wiki: I think agentDetector is missing as a mandatory module to run, @towardthesea.

Best, Tobi

jypuigbo commented 7 years ago

I've updated and properly tested the PPS on iCubBarcelona01 and it works nicely. Way better than before, thanks @matejhof!

Next thing will be to test it with the reactive controller, I guess, in order to integrate it in the demo. Let's discuss it with @towardthesea to see what is or is not possible to do :)

There is one issue, though: it seems that visuoTactileWrapper (or some related module) takes control of the head at some point when closing. This can generate motor-control conflicts and might explain one of the 'head-bangings' we saw at the last IM (at that point, I didn't know that any of the modules was actually sending motor commands). @pattacini @towardthesea @matejhof Do you think this might create a conflict with reactControl?

matejhof commented 7 years ago

Here are some notes based on my inspection of the code. visuoTactileWrapper, or more specifically its vtWThread, is using IGazeControl: https://github.com/robotology/peripersonal-space/blob/master/modules/visuoTactileWrapper/vtWThread.h

During the demo, how are you sending the objects to PPS? I assume via sensationManager? If you look here https://github.com/robotology/peripersonal-space/blob/master/modules/visuoTactileWrapper/vtWThread.cpp#L350 the gaze should only be commanded in case the red ball, "arbitrary objects / optic flow", or double touch are used as input objects to PPS; that is, not for sensationManager.

At threadRelease(), there is igaze->stopControl(); https://github.com/robotology/peripersonal-space/blob/master/modules/visuoTactileWrapper/vtWThread.cpp#L350 So I wonder whether that could cause the head to move. @pattacini?

Finally, reactControl has a config parameter gazeControl, which should be off by default. If on, it would gaze at its target and possibly create conflicts: it would need exclusive access to IGazeControl.

Does this help?

towardthesea commented 7 years ago

Hi @matejhof

At threadRelease(), there is igaze->stopControl(); https://github.com/robotology/peripersonal-space/blob/master/modules/visuoTactileWrapper/vtWThread.cpp#L350

I think you mean this line: void vtWThread::threadRelease(). It obviously asks the robot head to go to the "zero pose". I wonder whether we should keep this motion or not.

For the gazeControl inside reactControl, it has been set to off by default in the configuration file.

pattacini commented 7 years ago

Hi,

These lines are wrong:

igaze->lookAtAbsAngles(ang);
igaze->restoreContext(contextGaze);
igaze->stopControl();

Please correct them to:

igaze->lookAtAbsAnglesSync(ang);
igaze->waitMotionDone();
igaze->restoreContext(contextGaze);

Anyway, I wouldn't keep this homing. Further, the use of the context might be problematic and interfere with other modules.
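For reference, here is why the ordering matters, as a sketch mirroring the corrected lines above (the zero-pose angles are an assumption):

#include <yarp/dev/GazeControl.h>
#include <yarp/sig/Vector.h>

void releaseGaze(yarp::dev::IGazeControl *igaze, int contextGaze)
{
    yarp::sig::Vector ang(3, 0.0);       // hypothetical "zero pose" angles
    igaze->lookAtAbsAnglesSync(ang);     // synchronous variant of the call
    igaze->waitMotionDone();             // block until the homing completes
    igaze->restoreContext(contextGaze);  // only then put the controller back,
                                         // so the restore cannot preempt the
                                         // motion half-way through
}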

General question

Why does PPS control the gaze changing the context? Shouldn't it just use the gaze interface to sense the posture?

matejhof commented 7 years ago

This question is mostly for @alecive - this is his heritage. I opened an issue at PPS repo: https://github.com/robotology/peripersonal-space/issues/35

alecive commented 7 years ago

Disclaimer: I did not read the issue in its entirety.

Why does PPS control the gaze changing the context? Shouldn't it just use the gaze interface to sense the posture?

You're talking about two different modules:

  1. visuoTactileRF senses the posture, and does the PPS stuff. It does NOT need to control the gaze at all.
  2. visuoTactileWrapper is used for tracking objects in 3D and following them as they approach the body. It needs to control the gaze because how can the robot react to an object approaching if it cannot perceive it approaching?

I honestly don't know how your overall software architecture is implemented, or who does what, but obviously if you have some other module controlling the gaze for some reason, visuoTactileWrapper should not do that any more. Nonetheless, its purpose is tracking, and the fact that it controls the head is expected behavior and not a bug. Feel free to add a configuration parameter to disable the gaze control, though, but keep it on by default.
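Such a switch could be as simple as the sketch below (the parameter name gazeControl is an assumption, chosen to mirror reactControl's option mentioned earlier in this thread):

#include <yarp/os/ResourceFinder.h>
#include <yarp/os/Value.h>

bool gazeControlEnabled(yarp::os::ResourceFinder &rf)
{
    // --gazeControl off (on the command line or in the .ini) disables it;
    // the default stays "on", as requested.
    return rf.check("gazeControl", yarp::os::Value("on")).asString() == "on";
}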

@jypuigbo I think a "head-banging" behavior is rather difficult to produce with the gaze interface, which in any case should guarantee somewhat smooth trajectories, so I believe the culprit may be another module. Again, I honestly don't know how your overall software architecture is implemented, so take my comment with a grain of salt.

pattacini commented 7 years ago

Have a look at https://github.com/robotology/peripersonal-space/commit/dbf675ad384c48786cc4686b45d59062948ece93

alecive commented 7 years ago

:+1:

Tobias-Fischer commented 7 years ago

Can we close this issue? I think it's working :). Well done @towardthesea @jypuigbo!

pattacini commented 7 years ago

Actually, very well done! 🏅

matejhof commented 7 years ago

Glad to hear that!