miccol opened 2 years ago
Just to mention: as far as I can see (but please confirm it), stickBot has the same kinematic and inertial properties as iCub3, and the difference is only in the visuals. If that is the case, we might not need this intermediate model and could use the stickBot model directly.
@mfussi66 can confirm, but I think both the laser scanner and the RealSense were added to stickBot in https://github.com/icub-tech-iit/ergocub-gazebo-simulations/pull/11, so they should already be available.
Correct, the sensors have been added to stickBot and can be used. I just need to note that, due to the latest head design iterations, they might need to be placed a few millimeters higher in the URDF. I can update stickBot as soon as I have the latest CAD model.
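For reference, adjusting the sensor height in the URDF typically means editing the pose of the sensor inside its `<gazebo>` extension block. The sketch below is purely illustrative: the frame name, sensor name, and offsets are assumptions, not values from the actual stickBot model.

```xml
<!-- Hypothetical sketch: reference frame, sensor name, and pose values
     are assumptions and do not come from the real stickBot URDF. -->
<gazebo reference="head_laser_frame">
  <sensor name="head_laser" type="ray">
    <!-- z offset raised by a few millimeters to track the new head CAD -->
    <pose>0 0 0.005 0 0 0</pose>
    <update_rate>10</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>360</samples>
          <min_angle>-3.14159</min_angle>
          <max_angle>3.14159</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.10</min>
        <max>10.0</max>
      </range>
    </ray>
  </sensor>
</gazebo>
```

In practice only the `<pose>` z component would need to change once the final head CAD is available.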
> Just to mention: as far as I can see (but please confirm it), stickBot has the same kinematic and inertial properties as iCub3, and the difference is only in the visuals.
The inertial and visual properties are different, but the kinematics are the same.
If the inertial properties are different, we might need to change some parameters when moving from iCub3 to stickBot. In any case, this investigation is in progress (see https://github.com/ami-iit/component_ergocub/issues/63).
We (HSP) discussed the Y1D3 on autonomous navigation in simulation with AMI (MOM here). To simplify intermediate tests, we would like to ask you to add the laser scanner and the RealSense to the current iCub3 model (or to show us how to do it properly). The sensors should be placed in the same positions as in the ergoCub design. That way, we won't have to make major changes to the current Walking Controller.
Thanks MC
@randaz81 @lrapetti @SimoneMic