Closed (osrf-migration, closed 9 years ago)
Original comment by Jackie K (Bitbucket: jacquelinekay).
Great to hear you're interested in more extensive control over simulation! What are some examples of functionality you'd like to see in the simulator?
We're currently working on a simulation API that allows users to add, remove, and move objects in simulation, get contact information, etc. We expect to have this functionality released in early May. Here is the header file laying out the new structures and functions in the C API, which will be wrapped in .mex files for the release. Can you take a look when you have the chance and see if that's adequate for your needs? If not, can you give us some feedback on what you'd like to see in the simulation API?
Original comment by David Kluger (Bitbucket: DKluger).
Things the new header has that we like:
The ability to interact with non-MPL objects in the VRE (i.e. any functions taking _model or _name as an argument)
Apply forces without object interaction (hxs_wrench, hxs_torque, hxs_force, etc.)
Turn gravity on/off with the added ability to do this on an object-by-object basis
API functions not in the header that we would like:
Ability to change object colors and transparency mid-simulation
A function that returns a boolean indicating whether two objects are overlapping and sharing the same volume. This would require "phantom" objects to be rendered in the VRE with no density or mass so an MPL finger could move right through it. This might be easier said than done...
If it is possible to make these phantom objects, add an on/off argument to hxs_add_model for this setting. We would like the option to create these phantom objects even if the overlap function cannot be implemented.
Original comment by Jackie K (Bitbucket: jacquelinekay).
Here's how I would design the functionality you are describing:
/// _r, _g, _b, _alpha must be between 0 and 1.
/// _r, _g, _b are the red, green, and blue components of the object's color.
/// _alpha is the transparency: 1 is opaque, 0 is invisible.
void hxs_set_model_color(const char *_model, float _r, float _g, float _b, float _alpha);
/// Similarly, for a single link:
void hxs_set_link_color(const char *_model, const char *_link, float _r, float _g, float _b, float _alpha);
/// Similar to set_gravity_mode: _collides is 1 if the object collides with other objects, 0 otherwise.
void hxs_set_model_collide_mode(const char *_model, int _collides);
These functions would allow you to put objects into "phantom" mode using the following API calls, which you could put in a function of your own:
/// make the model transparent
hxs_set_model_color("ghost_model", 0, 0, 0, 0);
/// disable collisions so that the model passes through all other models
hxs_set_model_collide_mode("ghost_model", 0);
/// disable gravity so that the model doesn't sink due to gravity, otherwise it will fall through the ground
hxs_set_model_gravity_mode("ghost_model", 0);
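As a minimal sketch of such a user-side wrapper: the hxs_* functions below are stand-in stubs (assumptions, since the API is not yet released) that record state in a local table instead of talking to Gazebo, so the wrapper's effect can be checked offline.

```c
#include <string.h>

/* Local record of one model's state; the real calls would go to the
   simulator over the HAPTIX transport instead. */
typedef struct {
    char name[32];
    float alpha;   /* 1 = opaque, 0 = invisible */
    int collides;  /* 1 = collides with other models */
    int gravity;   /* 1 = affected by gravity */
} ModelState;

static ModelState g_model = {"ghost_model", 1.0f, 1, 1};

/* Stand-in stubs for the proposed hxs_* calls (signatures assumed). */
static void hxs_set_model_color(const char *m, float r, float g, float b, float a) {
    if (strcmp(m, g_model.name) == 0) g_model.alpha = a;
}
static void hxs_set_model_collide_mode(const char *m, int collides) {
    if (strcmp(m, g_model.name) == 0) g_model.collides = collides;
}
static void hxs_set_model_gravity_mode(const char *m, int gravity) {
    if (strcmp(m, g_model.name) == 0) g_model.gravity = gravity;
}

/* The user-side wrapper: one call toggles "phantom" mode on or off. */
void set_phantom_mode(const char *model, int phantom) {
    /* invisible when phantom; opaque otherwise (real code would
       restore the model's original color) */
    hxs_set_model_color(model, 0, 0, 0, phantom ? 0.0f : 1.0f);
    hxs_set_model_collide_mode(model, !phantom);
    hxs_set_model_gravity_mode(model, !phantom);
}
```

An experiment script would then call `set_phantom_mode("ghost_model", 1)` once per trial rather than issuing the three calls separately.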
@hsu and @caguero, what do you think of these extra API calls? I think they are feasible. For the first, we'd need to publish Visual messages with a new material to change the color. For the second one, I would use Collision::SetCollision to toggle collisions.
Original comment by Carlos Agüero (Bitbucket: caguero, GitHub: caguero).
A question about the function that tells you if two objects are sharing the same volume:
Original comment by Steve Peters (Bitbucket: Steven Peters, GitHub: scpeters).
We can use the collide_without_contact tag in ODE to check contact with "phantom" objects.
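For reference, a minimal SDF sketch of how that tag is placed on a collision element (geometry and sizes here are just examples):

```xml
<collision name="collision">
  <geometry>
    <box><size>0.05 0.05 0.05</size></box>
  </geometry>
  <surface>
    <contact>
      <!-- Contact information is still generated, but no contact
           forces are applied, so a finger can pass through the
           object while overlap is still reported. -->
      <collide_without_contact>true</collide_without_contact>
    </contact>
  </surface>
</collision>
```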
Original comment by David Kluger (Bitbucket: DKluger).
Carlos, the "bounding box" option is a suitable solution. It would not be fixed, but could have variable dimensions and an origin point to accommodate sensing overlap with both large and small objects.
Ideally, the "bounding box" would be one and the same with the object's volume, so it could take any shape and would be molded to the meshes that make up the objects. Again, I'm not sure if any of this is feasible. I should mention that we can do our experiments with the VRE in its current state, but these additions would make visual feedback stronger for subjects in experimental sessions.
Original comment by David Kluger (Bitbucket: DKluger).
On top of color, can other intrinsic parameters of the objects be changed mid simulation? Important ones for us would be compliance, frictional coefficients, and weight.
Original comment by Jackie K (Bitbucket: jacquelinekay).
Can you describe the experiment where you would need to dynamically change the compliance, frictional coefficient, and mass of an object?
You can set the frictional coefficients and mass of an object before runtime by making an SDF model. You can also spawn a model during simulation runtime using the HAPTIX API and specify its properties using SDF there. We plan on releasing more documentation for HAPTIX users in the upcoming release.
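For illustration, a minimal SDF model sketch setting mass and friction coefficients before runtime (element names per the SDF specification; the specific values and geometry are just examples):

```xml
<?xml version="1.0"?>
<sdf version="1.5">
  <model name="test_block">
    <link name="link">
      <inertial>
        <mass>0.5</mass> <!-- kg -->
      </inertial>
      <collision name="collision">
        <geometry>
          <box><size>0.05 0.05 0.05</size></box>
        </geometry>
        <surface>
          <friction>
            <ode>
              <mu>0.8</mu>   <!-- primary friction coefficient -->
              <mu2>0.8</mu2> <!-- secondary friction coefficient -->
            </ode>
          </friction>
        </surface>
      </collision>
      <visual name="visual">
        <geometry>
          <box><size>0.05 0.05 0.05</size></box>
        </geometry>
      </visual>
    </link>
  </model>
</sdf>
```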
Gazebo supports dynamically changing surface friction coefficients and mass of an object via our C++/Linux API. We were not planning on exposing these parameters to HAPTIX users because it's rare and unrealistic that an object would suddenly change its frictional and inertial properties.
It is not yet possible to change the compliance of an object in Gazebo. All objects in Gazebo have uniform stiffness and are not deformable. Supporting deformable objects in Gazebo is a milestone we plan to have completed near the end of July.
Original comment by David Kluger (Bitbucket: DKluger).
Mid-simulation changes are mostly a convenience feature for us. For example, we want to see whether a user can tell if an object in the environment is lightweight or heavy. We would assign a mass (say 0.5 kg or 5 kg) to an object in the environment, have a participant pick it up with the virtual limb, and tell us whether he/she thinks it is heavy or light. Since we as the experimenters know the assigned mass, we can mark each trial correct or incorrect. A statistically significant correct/incorrect rate would tell us whether our neural interface lets a prosthesis user judge the weight of an object. If we could change an object's mass mid-simulation, we could run trials very quickly; otherwise, we would have to reload the simulation for every trial.
Similar experiments could be done where we vary the coefficient of friction to emulate a sticky or slippery surface, or change the compliance to emulate a soft or hard object.
Original comment by Jackie K (Bitbucket: jacquelinekay).
Couldn't you create multiple objects of different masses and move them in front of the user for each trial? We have an API function for moving an object to any position. You could also hide objects that are not in use by setting their transparency level.
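A sketch of that workflow under stated assumptions: pre-spawn one block per mass, then per trial move the chosen block into the workspace and hide the rest. `hxs_set_model_transform` is an assumed name for the "move an object" call, and both hxs_* functions here are stubs that record state locally so the loop can be checked without the simulator.

```c
#include <string.h>

#define N_BLOCKS 2

/* Local stand-in for simulator state. */
typedef struct { const char *name; float x, y, z, alpha; } Block;

static Block g_blocks[N_BLOCKS] = {
    {"block_0p5kg", 5.0f, 0.0f, 0.0f, 0.0f}, /* parked and hidden */
    {"block_5kg",   5.0f, 1.0f, 0.0f, 0.0f},
};

static Block *find_block(const char *name) {
    for (int i = 0; i < N_BLOCKS; ++i)
        if (strcmp(g_blocks[i].name, name) == 0) return &g_blocks[i];
    return 0;
}

/* Stand-in stubs for the assumed hxs_* calls. */
static void hxs_set_model_transform(const char *m, float x, float y, float z) {
    Block *b = find_block(m);
    if (b) { b->x = x; b->y = y; b->z = z; }
}
static void hxs_set_model_color(const char *m, float r, float g, float bl, float a) {
    Block *b = find_block(m);
    if (b) b->alpha = a;
}

/* Present block `chosen` for one trial; park and hide the others. */
void present_block(int chosen) {
    for (int i = 0; i < N_BLOCKS; ++i) {
        if (i == chosen) {
            hxs_set_model_color(g_blocks[i].name, 1, 1, 1, 1);       /* visible */
            hxs_set_model_transform(g_blocks[i].name, 0.3f, 0.0f, 1.0f); /* workspace */
        } else {
            hxs_set_model_color(g_blocks[i].name, 0, 0, 0, 0);       /* invisible */
            hxs_set_model_transform(g_blocks[i].name, 5.0f, (float)i, 0.0f); /* parked */
        }
    }
}
```

Between trials the experiment script just calls `present_block()` with the next condition, with no simulation reload.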
Original comment by David Kluger (Bitbucket: DKluger).
That would work. The add/remove model functions could also be used in place of changing transparency.
Original report (archived issue) by David Kluger (Bitbucket: DKluger).
This might already be doable, but we'd like the ability to write our own API functions to interact with the VRE (starting with C/C++ code, and then create MATLAB .mex wrappers). This would allow us to write simulation-specific functions to help use the VRE for experiments.
Is this doable with the VRE and SDK in their current state? If yes, could a tutorial be written explaining how to do this? If not, could this feature be added to a future release?