ami-iit / bipedal-locomotion-framework

Suite of libraries for achieving bipedal locomotion on humanoid robots
https://ami-iit.github.io/bipedal-locomotion-framework/
BSD 3-Clause "New" or "Revised" License

Add `Humanoid` high-level class #377

Open · diegoferigo opened this issue 2 years ago

diegoferigo commented 2 years ago

For quite a long time I've been talking with people working on different projects about having a single shared entry point for computing the common quantities used in software development for humanoid robots. It could be either C++ or Python; the latter would probably be quicker to prototype, but I think this resource would also be very useful from C++.

Last year, in a project which is still private, I started something similar, and I'm pretty sure that many of us have similar functionalities scattered across a dozen repositories. I attach my script in the details below. @dic-iit/dynamic-interaction-control, if you have similar classes, please link them here so that we can at least assess the minimum set of functionalities that could be interesting to the whole group.

I'd like to start with an initial brainstorming about the desired features. Feel free to update this comment if you have the rights to do so; otherwise I'll try to keep the list updated:

Additional ideas:

humanoid.py

This class requires a `scenario.core.Model` but, to remain aligned with the BLF structure, it could use a `KinDynComputations` object instead.

```py
# Copyright (C) 2020 Istituto Italiano di Tecnologia (IIT). All rights reserved.
# This software may be modified and distributed under the terms of the
# GNU Lesser General Public License v2.1 or any later version.

import numpy as np
from shapely.geometry import Polygon
from scenario import core as scenario_core
from typing import List, NamedTuple, Optional
from gym_pushrecovery.utils import conversions


class Point2D(NamedTuple):
    """
    A 2D point in space.
    """

    x: float = 0
    y: float = 0


class Humanoid:
    """
    Relay class of a :py:class:`~scenario_bindings.Model` providing methods
    useful for humanoid robots.

    Note:
        This class assumes contact with flat terrain.

    Args:
        model: The model to wrap.
    """

    def __init__(self, model: scenario_core.Model):

        if not model.valid():
            raise ValueError("The model is not valid")

        self.model = model

    def get_foot_center_of_pressure(self,
                                    foot_link_name: str,
                                    normal: np.ndarray = np.array([0, 0, 1.0]),
                                    other_scoped_link: str = "ground_plane::link",
                                    cop_force_threshold: float = 1E-5) \
            -> Optional[Point2D]:
        """
        Get the local center of pressure of a foot.

        Args:
            foot_link_name: The name of the foot link.
            normal: The normal of the foothold plane.
            other_scoped_link: The name of the other body in contact.
                The default value is the ground plane.
            cop_force_threshold: Threshold on the normal force to skip the
                active contact.

        Returns:
            The 2D world coordinates of the center of pressure if the foot is
            in contact, None otherwise.
        """

        force_mag = self.get_foot_center_of_pressure_force(
            foot_link_name=foot_link_name,
            normal=normal,
            other_scoped_link=other_scoped_link)

        if force_mag <= cop_force_threshold:
            return None

        # Get the contact wrench
        wrench = self.model.get_link(foot_link_name).contact_wrench()
        force, torque = np.split(np.array(wrench), 2)

        # Compute the CoP wrt the link frame
        torque_x, torque_y, _ = torque
        l_cop = np.array([-torque_y, torque_x, 0]) / force_mag

        # Convert the CoP in homogeneous coordinates
        l_cop_hom = np.concatenate((l_cop, [1]))

        # Return it wrt the world frame
        foot = self.model.get_link(foot_link_name)
        W_H_L = conversions.Transform.from_position_and_quaternion(
            position=np.array(foot.position()),
            quaternion=np.array(foot.orientation()))
        w_cop = W_H_L @ l_cop_hom

        return Point2D(x=w_cop[0], y=w_cop[1])

    def get_foot_center_of_pressure_force(self,
                                          foot_link_name: str,
                                          normal: np.ndarray = np.array([0, 0, 1.0]),
                                          other_scoped_link: str = "ground_plane::link") \
            -> float:
        """
        Get the normal force at the center of pressure.

        Args:
            foot_link_name: The name of the foot link.
            normal: The normal of the foothold plane.
            other_scoped_link: The name of the other body in contact.
                The default value is the ground plane.

        Returns:
            The force along the normal axis with origin at the center of
            pressure.
        """

        foot = self.model.get_link(foot_link_name)

        contact = self._extract_contact_with_body(
            link=foot, other_scoped_link=other_scoped_link)

        if contact is None:
            return 0.0

        wrench = self.model.get_link(foot_link_name).contact_wrench()
        force, torque = np.split(np.array(wrench), 2)

        force_mag = force.dot(normal)
        return float(force_mag)

    def get_support_polygon(self,
                            support_links: List[str],
                            ground_link: str = "ground_plane::link") \
            -> List[Point2D]:
        """
        Get the vertices of the support polygon of the humanoid.

        Args:
            support_links: The links of the humanoid to consider for
                calculating the support polygon.
            ground_link: The scoped name of the other link considered as
                terrain. It defaults with the ground plane link.

        Returns:
            A list of 2D points of the coordinates of the polygon vertices if
            there are enough contacts, an empty list otherwise. The points are
            ordered counterclockwise and they represent a ring (the first
            point is also the last).
        """

        contact_points: List[Point2D] = list([])

        for link_name in support_links:

            contact = self._extract_contact_with_body(
                link=self.model.get_link(link_name),
                other_scoped_link=ground_link)

            if contact is not None:
                for point in contact.points:
                    contact_points.append(Point2D(x=point.position[0],
                                                  y=point.position[1]))

        if len(contact_points) >= 3:
            polygon = Polygon(contact_points)
            convex_hull_vertices = polygon.convex_hull.exterior.coords
            return [Point2D._make(p[0:2]) for p in list(convex_hull_vertices)]

        return []

    def get_support_polygon_centroid(self,
                                     support_links: List[str],
                                     ground_link: str = "ground_plane::link",
                                     ) -> Optional[Point2D]:
        """
        Get the centroid of the support polygon.

        Note:
            In simulated models the centroid is calculated from the contact
            points provided by the physics engine.

        Args:
            support_links: The links of the humanoid to consider for
                calculating the support polygon.
            ground_link: The scoped name of the other link considered as
                terrain. It defaults with the ground plane link.

        Returns:
            The 2D point of the coordinate of the support polygon centroid if
            there are enough contacts, ``None`` otherwise.
        """

        convex_hull_vertices = self.get_support_polygon(
            support_links=support_links, ground_link=ground_link)

        if len(convex_hull_vertices) == 0:
            return None

        centroid_coordinates = Polygon(convex_hull_vertices).centroid.coords[0]
        return Point2D._make(centroid_coordinates)

    # ===============
    # Private methods
    # ===============

    @staticmethod
    def _extract_contact_with_body(link: scenario_core.Link,
                                   other_scoped_link: str) \
            -> Optional[scenario_core.Contact]:
        """
        Given a model link and the scoped name of another link part of the
        simulation, get the contact between them.

        The other link name should be scoped, e.g. ``model_name::link_name``.

        Args:
            link: The link belonging to the model.
            other_scoped_link: The scoped name of the other link in contact.

        Returns:
            The contact if it exists, ``None`` otherwise.
        """

        if not link.in_contact():
            return None

        for contact in link.contacts():
            if contact.body_b == other_scoped_link:
                assert contact.body_a == link.name(scoped=True)
                return contact

        return None
```
S-Dafarra commented 2 years ago

I see the advantage of such a high-level class. On the other hand, the devil is in the details, and I think that (given its high potential for wide adoption) this has to be discussed properly.

For example, I think that the methods should clearly indicate whether they need to communicate with the robot or not. Just to give some examples of possible confusion:

In addition, the definition of this class may potentially limit future applications. For example, we could provide the centroid of the support polygon, but what if the feet are not coplanar? In that case, it is not well defined. Hence, I am afraid of training users to use these kinds of methods without knowing what is actually happening.

From my point of view, rather than a superclass, I would prefer the possibility of constructing an architecture by connecting simple elements whose limits and applicability I know.

traversaro commented 2 years ago

Without entering into the details of what @S-Dafarra wrote, just a naming bikeshedding: I would avoid having a class named Humanoid and would instead try to capture the actual class of application, so either Biped<something> if it is indeed limited to something with two legs, or Legged<something> if it applies to any legged system. Furthermore, if the class has some assumptions (non-point contacts, flat terrain), I would reflect them in the name.

diegoferigo commented 2 years ago

@S-Dafarra yes I totally agree with you. This is a design issue and is meant to trigger the discussion on points similar to those you raised.

> I think that the methods should clearly indicate whether they need to communicate with the robot or not.

Going into more detail on your comment, I'd envision this class to be agnostic to the robot source. My Python implementation is based on scenario.core.Model, which is currently only implemented for Ignition Gazebo, but I have a draft for a YARP robot. Similarly, a BLF class could work independently of the robot implementation, operating on an object that implements the interfaces in the RobotInterface folder.
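Just to make the idea concrete, a hypothetical sketch (IRobotSource and LeggedRobotHelper are made-up names used only for illustration; in blf the role of the source would be covered by the RobotInterface classes):

```cpp
#include <Eigen/Dense>
#include <memory>
#include <string>

// Minimal interface that abstracts where the data comes from
// (simulator, YARP robot, log file, ...). Made up for illustration.
struct IRobotSource
{
    virtual ~IRobotSource() = default;
    virtual bool getJointPositions(Eigen::VectorXd& jointPositions) = 0;
    virtual bool getContactWrench(const std::string& linkName,
                                  Eigen::Matrix<double, 6, 1>& wrench) = 0;
};

// The high-level class only depends on the interface, so the same
// computations (CoP, support polygon, ...) work with any robot source.
class LeggedRobotHelper
{
public:
    explicit LeggedRobotHelper(std::shared_ptr<IRobotSource> source)
        : m_source(std::move(source))
    {
    }

private:
    std::shared_ptr<IRobotSource> m_source;
};
```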

> In addition, the definition of this class may potentially limit future applications. For example, we could provide the centroid of the support polygon, but what if the feet are not coplanar? In that case, it is not well defined. Hence, I am afraid of training users to use these kinds of methods without knowing what is actually happening.

Sure, the provided features and limitations must be clearly documented. However, right now I see massive duplication of these functionalities, and starting to collect the most common ones, even if the result is not complete, is a starting point.

> From my point of view, rather than a superclass, I would prefer the possibility of constructing an architecture by connecting simple elements whose limits and applicability I know.

I like having a unique entry point; think about how people use KinDynComputations and how useful it would be to have a similar superclass for all the features that can be unified. That being said, maybe not all the features can go under the same hat; in that case we can create subclasses that inherit and specialize.

> just a naming bikeshedding [...]

Totally open to any clearer naming proposal :wink:

S-Dafarra commented 2 years ago

> Going into more detail on your comment, I'd envision this class to be agnostic to the robot source

In my opinion, it is important to make it clear from the code when the communication is happening. This is very important when working with the real platform, so that you understand whether you are working with "fresh" data or not. Just to reason a little bit more about the resulting code structure, imagine, for example, a series of calls like:

```
humanoid.computeSomething()
humanoid.getThis()
humanoid.computeSomethingElse()
humanoid.setThat()
humanoid.getThose()
```

From code like this, it is hard to understand when things are read from the robot, computed, or commanded to the robot.

I kinda like the pattern used in game engines more:

```
beginScene()
doSomething()
endScene()
```

beginScene() is where things are initialized; then you do something, and when you call endScene() all the modifications take effect and the scene is rendered to the window. This is because the drawing part is expensive and you want to control when it is done.

KinDynComputations has a similar pattern. You set the model, allocating memory. Then you set the robot state, after which you can retrieve all the quantities you need. In this case, when the computation happens is a little hidden (for example, it updates the kinematics the first time you request a transform), but you know that you have to set the robot state first because that's the only way to specify the joint values.
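For reference, a minimal sketch of that pattern with iDynTree's KinDynComputations; the header path, method signatures, and the "l_sole" frame name below are recalled from memory and simplified, so treat them as assumptions rather than a reference:

```cpp
#include <iDynTree/KinDynComputations.h>

void example(const iDynTree::Model& model,
             const iDynTree::Transform& world_H_base,
             const iDynTree::VectorDynSize& jointPos,
             const iDynTree::Twist& baseVelocity,
             const iDynTree::VectorDynSize& jointVel,
             const iDynTree::Vector3& gravity)
{
    iDynTree::KinDynComputations kinDyn;

    // 1. Set the model: this is where memory gets allocated.
    kinDyn.loadRobotModel(model);

    // 2. Set the robot state: the only way joint values enter the object.
    kinDyn.setRobotState(world_H_base, jointPos, baseVelocity, jointVel, gravity);

    // 3. Query quantities: the kinematics is refreshed lazily at the first request.
    //    "l_sole" is just an example frame name.
    const iDynTree::Transform world_H_sole = kinDyn.getWorldTransform("l_sole");
}
```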

Hence, my hint would be to define the methods such that it becomes clear that the communication with the robot has to be initiated, or that you are about to send some commands. Just documenting things is often not enough.
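In other words, something along these lines (a declaration-only naming sketch; all class and method names are made up):

```cpp
// Hypothetical API sketch: the robot I/O is confined to explicitly named
// methods, while everything else is a pure computation on cached data.
class LeggedRobotInterface
{
public:
    // Communication with the robot happens only here.
    bool readStateFromRobot();   // refresh the cached measurements
    bool sendCommandsToRobot();  // flush the commands prepared so far

    // Pure computations on the last read state: no hidden communication.
    double centerOfMassHeight() const;
    bool isLeftFootInContact() const;
};
```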

diegoferigo commented 2 years ago

> In my opinion, it is important to make it clear from the code when the communication is happening. This is very important when working with the real platform, so that you understand whether you are working with "fresh" data or not.

Definitely. Two possibilities are either using an approach similar to KinDynComputations::setRobotState (since the state is well defined) or a more generic NewHumanoidClass::update() that uses a configurable callback (which could be, in a more complex example, a closure binding a simulator):

```cpp
// 1. Set the state explicitly
auto obj1 = NewHumanoidClass();
obj1.setState(var1, var2, var3);

// 2. Use a callback
auto cb = [&]() -> NewHumanoidClassState { /*...*/ };

auto obj2 = NewHumanoidClass();
obj2.setUpdateCallback(cb);

// This could be downcast to the interface after this point
obj2.update();
```

These are just the first two examples that come to mind.

> I kinda like the pattern used in game engines more: [...]

Honestly, I do not really like any approach that requires explicitly opening and closing a computation block, because it is a dangerous pattern that gives too much responsibility to the caller (beyond making error checking and catching behavioral mistakes more difficult). It also goes against the language's recent push towards RAII-like implementations. Maybe there are drawbacks I'm not seeing right now, but the callback seems like a good trade-off. It also allows having different states for different implementations, if needed. Then, the ::update method could be generic and part of the interface.

Edit: thinking more about it, moving the I/O logic (actually, just the input) into the callback could even avoid the interface + multiple implementations pattern. Then, the NewHumanoidClass could just work on a struct of plain data.
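A rough sketch of what that could look like (all the names here, HumanoidState included, are hypothetical and only meant to illustrate the idea):

```cpp
#include <Eigen/Dense>
#include <functional>
#include <utility>

// Plain-data state: no I/O, no robot handles, just numbers.
struct HumanoidState
{
    Eigen::VectorXd jointPositions;
    Eigen::VectorXd jointVelocities;
    // base pose, contact wrenches, ...
};

class NewHumanoidClass
{
public:
    void setUpdateCallback(std::function<HumanoidState()> callback)
    {
        m_readState = std::move(callback);
    }

    bool update()
    {
        if (!m_readState)
        {
            return false;
        }

        // The callback is the only place where data enters the class,
        // e.g. a closure binding a simulator or a sensor bridge.
        m_state = m_readState();
        return true;
    }

    // The computations (CoP, support polygon, ...) would only use m_state.

private:
    std::function<HumanoidState()> m_readState;
    HumanoidState m_state;
};
```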

S-Dafarra commented 2 years ago

This update pattern seems very similar to the advance method we have in Advanceable. If you don't want to use the interface + multiple implementations pattern, a possibility could be to create an Advanceable implementation that accepts std::functions in the constructor.
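A minimal sketch of that idea, assuming an Advanceable-like block exposing advance() and getOutput() (the real blf Advanceable API may differ, so take the shape below as an assumption):

```cpp
#include <functional>
#include <utility>

// Hypothetical source block whose output is produced by a std::function
// passed at construction time. Output must be default-constructible.
template <class Output>
class FunctionSource
{
public:
    explicit FunctionSource(std::function<Output()> producer)
        : m_producer(std::move(producer))
    {
    }

    bool advance()
    {
        if (!m_producer)
        {
            return false;
        }

        // E.g. read from a simulator, a YARP sensor bridge, or a log file.
        m_output = m_producer();
        return true;
    }

    const Output& getOutput() const
    {
        return m_output;
    }

private:
    std::function<Output()> m_producer;
    Output m_output{};
};
```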

Notice that some parts, like the YARP sensor bridge implementation, already inherit from Advanceable (thus they already have the advance method, see https://github.com/dic-iit/bipedal-locomotion-framework/blob/master/src/RobotInterface/YarpImplementation/include/BipedalLocomotion/RobotInterface/YarpSensorBridge.h#L115).

This goes in the direction of the block-like infrastructure that @GiulioRomualdi was already experimenting with. That infrastructure also allows, for example, running different parts at different rates (see https://github.com/dic-iit/bipedal-locomotion-framework/blob/master/src/System/include/BipedalLocomotion/System/AdvanceableRunner.h).

diegoferigo commented 2 years ago

> This update pattern seems very similar to the advance method we have in Advanceable. If you don't want to use the interface + multiple implementations pattern, a possibility could be to create an Advanceable implementation that accepts std::functions in the constructor.

This is a brilliant observation @S-Dafarra. I didn't think of it this way initially, but the existing design of blf definitely shaped my proposal unconsciously :smile:

In any case, this discussion is a broad brainstorming that I wanted to trigger, since I'm seeing code being duplicated quite fast. I'm not planning to work on this in the near future. Still, let's keep this discussion in mind and try to keep it updated.