isaac-sim / IsaacLab

Unified framework for robot learning built on NVIDIA Isaac Sim
https://isaac-sim.github.io/IsaacLab

[Question] How to set attributes for a visualization marker created using a robot asset #1321

Open VineetTambe opened 4 weeks ago

VineetTambe commented 4 weeks ago

Question

Hey,

I am trying to set up a visualization in Isaac Sim to show the target joint configuration being applied to the robot. I see in the documentation that I can create a VisualizationMarkers instance from a robot USD file, but after going through the source code it looks like the available APIs only let me set the position and orientation of the root link. How do I go about setting joint_positions (or other attributes) for the VisualizationMarkers the way I can for an Articulation object?
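To make the comparison concrete, this is roughly what I mean by doing it for an Articulation object; a minimal sketch, assuming an already-spawned Articulation named robot (the names here are just illustrative):

    import torch

    # `robot` is an omni.isaac.lab.assets.Articulation that has already been spawned;
    # `target_dof_pos` holds the desired joint configuration, shape (num_envs, num_joints).
    target_dof_pos = torch.zeros_like(robot.data.joint_pos)

    # Write the desired joint state (positions and zero velocities) directly into the simulation.
    robot.write_joint_state_to_sim(target_dof_pos, torch.zeros_like(target_dof_pos))

I am looking for an equivalent way to pose a robot that was spawned purely as a visualization marker.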

I see the following message in the change-log provided in the docs:

0.10.6 (2023-12-19)
~~~~~~~~~~~~~~~~~~~

Added
^^^^^

* Added support for using articulations as visualization markers. This disables all physics APIs from
  the articulation and allows the user to use it as a visualization marker. It is useful for creating
  visualization markers for the end-effectors or base of the robot.

I followed this NVIDIA doc for implementing visualization markers: [link]

Here is the code snippet for instantiating the VisualizationMarkers:

    import omni.isaac.lab.sim as sim_utils
    from omni.isaac.lab.markers import VisualizationMarkers, VisualizationMarkersCfg

    # Reuse the robot's spawn configuration as the marker prototype, with a green preview material.
    my_visualizer = VisualizationMarkers(VisualizationMarkersCfg(
        prim_path="/World/Markers/Robot",
        markers={
            "robot_marker": robot_cfg.spawn.replace(
                visual_material=sim_utils.PreviewSurfaceCfg(diffuse_color=(0.0, 1.0, 0.0))
            ),
        },
    ))

And here is how I call it in the simulation loop:

    while True:
        # Only the root pose can be passed to the markers; the target joint
        # positions (target_robot_dof_pos) cannot be applied to them.
        target_robot_pos, target_robot_quat, target_robot_dof_pos = get_target_robot_state()
        my_visualizer.visualize(robots.data.root_pos_w, robots.data.root_quat_w)

        sim.step()
        # update sim-time
        sim_time += sim_dt

The VisualizationMarkers class only provides the .visualize() method, which only accepts an Mx3 position matrix and an Mx4 quaternion orientation matrix as inputs; there is nothing equivalent for joint positions.
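For context, this is roughly the full extent of what I can pass to .visualize(), as far as I can tell from the source (a sketch; names follow my snippet above):

    # As far as I can tell, .visualize() only accepts per-instance poses
    # (plus optional scales / marker indices), never joint positions.
    my_visualizer.visualize(
        translations=target_robot_pos,    # (M, 3) positions
        orientations=target_robot_quat,   # (M, 4) quaternions (w, x, y, z)
    )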

alxschwrz commented 2 weeks ago

It is currently not possible with the VisualizationMarkers class [link], which is based on the UsdGeom.PointInstancer class. The changelog entry you mention basically means that you can attach distinct visualization markers (e.g. a sphere) to specific frames of the robot (e.g. the end-effector). You can load a specific USD file as a marker, but you cannot articulate that USD through the API.
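To illustrate that pattern, here is a minimal sketch of placing a frame marker at the end-effector pose of each environment; it assumes an Articulation named robot and an end-effector body index ee_idx (both illustrative):

    from omni.isaac.lab.markers import VisualizationMarkers
    from omni.isaac.lab.markers.config import FRAME_MARKER_CFG

    # A frame (axes) marker prototype; any static shape or USD config works here.
    ee_marker = VisualizationMarkers(FRAME_MARKER_CFG.replace(prim_path="/Visuals/ee_frame"))

    # In the sim loop: place one marker at each environment's end-effector pose.
    ee_marker.visualize(
        robot.data.body_pos_w[:, ee_idx],   # (num_envs, 3)
        robot.data.body_quat_w[:, ee_idx],  # (num_envs, 4)
    )

The marker itself stays a static prototype; only its instance poses are updated each step.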

RandomOakForest commented 5 days ago

Thank you for posting this. Let us know if you still need a workaround. I will tag this for the team to review as a possible improvement in upcoming releases.