RobotLocomotion / drake

Model-based design and verification for robotics.
https://drake.mit.edu

geometry: Should support other mesh formats at run-time (and document existing run-time support)? #14436

Closed. EricCousineau-TRI closed this issue 1 month ago.

EricCousineau-TRI commented 3 years ago

This is a follow-up issue from here: https://github.com/RobotLocomotion/drake/issues/2941#issuecomment-739915733

Possible formats:

See downstream question: https://stackoverflow.com/questions/65101327/how-to-use-ros-urdf-package-in-drake/65182577?noredirect=1#comment115257216_65182577

In briefly perusing our docs (as of 2020-12-09), I checked a few different places quickly and didn't see any mention of OBJ-only support (relates to #13314):

- https://drake.mit.edu/doxygen_cxx/classdrake_1_1geometry_1_1_mesh.html
- https://drake.mit.edu/doxygen_cxx/classdrake_1_1multibody_1_1_parser.html#a9b40bb492d33e373b3bfcf5d440a36c2
- https://drake.mit.edu/doxygen_cxx/classdrake_1_1geometry_1_1_scene_graph.html
- https://drake.mit.edu/doxygen_cxx/classdrake_1_1geometry_1_1render_1_1_render_engine.html
- https://drake.mit.edu/doxygen_cxx/namespacedrake_1_1geometry_1_1render.html#a84f6890eb23c6cba8a7a342bc11fcba3 (RenderEngineVtk)
- https://drake.mit.edu/doxygen_cxx/namespacedrake_1_1geometry_1_1render.html#aff58ede2e6498518901d8a297a3bd07b (RenderEngineGl)

EricCousineau-TRI commented 3 years ago

FYI @SeanCurtis-TRI I've assigned you for now. Feel free to reassign, or close if you think it's a complete "wontfix" (though it'd still be nice to see the docs be more explicit per #13314).

\cc @sherm1 @joemasterjohn

jwnimmer-tri commented 3 years ago

For starters, I'm in full support of better documentation. We should do that.

But in terms of adding new features, simply saying "support" in the issue title raises the question of what, exactly, you'd like us to support. Being clearer about the specific prioritization of possible enhancements would help focus and motivate the work here.


A common, simple goal might only be for visualization (i.e., the illustration role). In that case, the job of SceneGraph could be as simple as ferrying the mesh URI (and/or its resolved filename) between the SDFormat input file and the draw message transmitted to the visualizer. Or at worst, streaming the file's raw bytes to the visualizer, in case the visualizer doesn't have local filesystem access. Drake itself might never actually have to parse or understand the illustration mesh format.


A more ambitious goal would be to use the mesh for sensor simulation (i.e., the perception role). In most cases, though, the implementation of a given sensor's simulation will use some third-party rendering toolkit, and that library will already have some footprint of what it can and can't handle, which is outside of Drake's control. (Even Drake's "GL renderer" could be viewed in the same way, where it only supports Wavefront.)

We should document the feature matrix of a given renderer, but I don't know that we should try to route around it by transmogrifying meshes ourselves on the fly -- doing so tends to hit a lot of sharp edges, where you'll usually want an artist to get involved. We could still work with the toolkits' upstream to try to get them to have first-party support for the formats we want.

We could also imagine implementing a "ChooseBestRenderer" dispatch (like we have with ChooseBestSolver for mathematical programs), where we programmatically select which renderer to use based on the capabilities required by the scene.
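To make the dispatch idea concrete, here is a hedged sketch in Python. Note that `ChooseBestRenderer`, `RendererInfo`, and the per-renderer format table below are all hypothetical; only `ChooseBestSolver` exists in Drake today, and the capability lists here are illustrative, not authoritative.

```python
# Hypothetical sketch of a capability-based renderer dispatch, loosely
# modeled on Drake's ChooseBestSolver for mathematical programs. None of
# these names or format tables are real Drake APIs.
from dataclasses import dataclass


@dataclass
class RendererInfo:
    name: str
    # Mesh formats this renderer's underlying toolkit can load
    # (illustrative values only).
    supported_formats: frozenset


RENDERERS = [
    RendererInfo("RenderEngineGl", frozenset({".obj"})),
    RendererInfo("RenderEngineVtk", frozenset({".obj", ".gltf"})),
]


def choose_best_renderer(scene_mesh_extensions):
    """Return the first renderer whose toolkit supports every mesh
    format required by the scene, or raise if none qualifies."""
    required = frozenset(ext.lower() for ext in scene_mesh_extensions)
    for renderer in RENDERERS:
        if required <= renderer.supported_formats:
            return renderer.name
    raise RuntimeError(f"No renderer supports all of: {sorted(required)}")


print(choose_best_renderer([".obj"]))           # -> RenderEngineGl
print(choose_best_renderer([".obj", ".gltf"]))  # -> RenderEngineVtk
```

The scene's required formats drive the selection, mirroring how `ChooseBestSolver` inspects a program's costs and constraints before picking a solver.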


Obviously, the other possible final goal would be to accept more mesh formats for the proximity role. Since we are implementing those geometric queries ourselves, it is up to us to work with the kinds of meshes that users want to collide.

Our mesh collision capabilities are a bit nascent anyway, so I'm not sure how urgent it is to support more mesh input formats here. If we did, though, we would only need to extract the shape information from the mesh format, not any of the visual data (textures, normals, etc.) -- that could be much easier. I am hesitant to say we'd even need that much, though. It would be very unusual to use a robot's (or manipuland's) unadulterated visual meshes for proximity computations. Inevitably, we end up using CAD tools to rework the original visual mesh into something more suitable for proximity. Asking whoever does that to save as Wavefront instead of DAE, or converting it offline after the fact, doesn't seem overly onerous.
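To illustrate how little of a mesh format the proximity role actually needs, here is a stdlib-only sketch that pulls just the vertices and triangles out of a binary STL and writes a geometry-only OBJ, discarding normals entirely. This is purely illustrative and is not Drake's parser; the binary-STL layout (80-byte header, uint32 triangle count, 50-byte records) is the standard one.

```python
# Minimal sketch: extract only the shape (vertices + triangles) from a
# binary STL and emit a geometry-only OBJ, ignoring normals entirely.
# Illustrative only; this is not Drake's parser.
import struct


def stl_to_obj(stl_bytes):
    (n_tris,) = struct.unpack_from("<I", stl_bytes, 80)
    verts, vert_index, faces = [], {}, []
    offset = 84
    for _ in range(n_tris):
        # Each record: 3 normal floats (skipped), 9 vertex floats, u16 attr.
        values = struct.unpack_from("<12fH", stl_bytes, offset)
        offset += 50
        face = []
        for i in range(3):
            v = values[3 + 3 * i:6 + 3 * i]
            if v not in vert_index:        # de-duplicate shared vertices
                vert_index[v] = len(verts)
                verts.append(v)
            face.append(vert_index[v] + 1)  # OBJ indices are 1-based
        faces.append(face)
    lines = [f"v {x} {y} {z}" for x, y, z in verts]
    lines += [f"f {a} {b} {c}" for a, b, c in faces]
    return "\n".join(lines) + "\n"


# Tiny demo: a single-triangle binary STL built in memory.
tri = struct.pack("<12fH", 0, 0, 1,  0, 0, 0,  1, 0, 0,  0, 1, 0,  0)
demo = b"\0" * 80 + struct.pack("<I", 1) + tri
print(stl_to_obj(demo))
```

Because only positions and connectivity survive, the conversion sidesteps all of the texture/material sharp edges that make general-purpose mesh transmogrification painful.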

RussTedrake commented 1 year ago

I hit this issue immediately when showing a friend how to open their PR2 model in Drake. The standard PR2 urdf ships with .stl assets. In the pr2_simplified.urdf that we ship in Drake, we have manually changed all mentions of .stl to .obj. Users shouldn't have to do that.

In #14219 we added stl2obj, which allows the build system to do that conversion automatically. But currently pydrake users don't get to use it.

Request: pydrake users should be able to load assets described with .stl in their urdf files.

jwnimmer-tri commented 1 year ago

@RussTedrake to be clear, the robot URDFs in question are these ones?

From a quick skim, it looks like visual meshes are DAE, and collision meshes are STL.

Is that correct?

RussTedrake commented 1 year ago

@jwnimmer-tri -- short answer is "yes". The repo I'm working with has a version of this file with xacro already run, but I believe it is otherwise almost identical. Certainly it has the visual meshes as DAE and collision meshes as STL.

Currently, attempting to parse it into MultibodyPlant/SceneGraph fails with

RuntimeError: ProximityEngine: expect an Obj file for non-hydroelastics but get .stl file (.../models/ltamp/pr2_description/meshes/base_v0/base_L.stl) instead.

jwnimmer-tri commented 1 year ago

Got it, thanks.

The ideal answer here is probably that Drake can parse STL meshes (and maybe DAE) as collision objects, directly, from our Parser class (and its helper functions in SceneGraph, etc).

A possible stop-gap is that when the user provides a STL mesh in a model file, they get an exception message pointing them to a new https://drake.mit.edu/troubleshooting.html section that explains how to convert the mesh by hand (and probably check it into their source control), using an easy-to-install and robust tool, probably from pip. Possibly a pydrake module if there isn't any tool in place. Something like 2-3 lines of shell code they can copy-paste to overcome their problem.
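As a sketch of what that troubleshooting snippet might look like, assuming the pip-installable tool ends up being `trimesh` (the filename here is a placeholder, not one of Drake's files):

```shell
# Hypothetical stop-gap conversion, roughly the 2-3 copy-paste lines
# envisioned above. Assumes network access for pip.
pip install trimesh numpy
python3 -c "import trimesh; trimesh.load('mesh.stl').export('mesh.obj')"
```

Checking the resulting `.obj` into source control alongside the model keeps the conversion a one-time cost rather than a per-parse one.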

RussTedrake commented 1 year ago

I agree on both counts.

jwnimmer-tri commented 1 year ago

At least in ModelVisualizer, converting the STL meshes using trimesh offline looks okay:

[screenshot: the PR2 model loaded in ModelVisualizer after STL-to-OBJ conversion]

Do you have any acceptance test for whether any particular obj-conversion is good enough for contact simulation? Or is the goal only to get it to the point of loading, with the expectation that users would soon-thereafter replace the contact geometry with something better than these meshes?
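One cheap acceptance check along these lines would be to verify the converted mesh is watertight (every edge shared by exactly two triangles) and has positive signed volume (consistent outward winding). A stdlib-only sketch, purely illustrative and not a Drake API:

```python
# Sketch of a cheap acceptance check for a converted collision mesh:
# watertightness plus a divergence-theorem volume whose sign reveals
# inconsistent winding. Illustrative only; not a Drake API.
from collections import Counter


def check_mesh(vertices, faces):
    edge_count = Counter()
    volume = 0.0
    for a, b, c in faces:
        for u, v in ((a, b), (b, c), (c, a)):
            edge_count[frozenset((u, v))] += 1
        (x0, y0, z0), (x1, y1, z1), (x2, y2, z2) = (
            vertices[a], vertices[b], vertices[c])
        # Signed volume of the tetrahedron (origin, v0, v1, v2).
        volume += (x0 * (y1 * z2 - y2 * z1)
                   - y0 * (x1 * z2 - x2 * z1)
                   + z0 * (x1 * y2 - x2 * y1)) / 6.0
    watertight = all(n == 2 for n in edge_count.values())
    return watertight, volume


# Unit tetrahedron with outward-facing windings; volume should be 1/6.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
faces = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
print(check_mesh(verts, faces))
```

A check like this only gates on gross defects (holes, flipped faces), not on whether the geometry is a *sensible* contact surface, which still needs a human.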


Repro

(0) Virtualenv on Ubuntu 22:

python3 -m venv env
env/bin/pip install --upgrade pip
env/bin/pip install drake trimesh xacro

(1) Check out PR2, run xacro, and rewrite the URDF/xacro text to use OBJ instead of STL:

rm -rf pr2_common && git clone --filter=blob:none https://github.com/PR2/pr2_common.git
cd pr2_common
git reset --hard 1.13.1
cd pr2_description
DESC=$(pwd)
find . -name '*.xacro' | xargs perl -pi.orig -e \
  's#\$\(find pr2_description\)/#'$DESC'/#g; s#\$\(arg KINECT.\)#false#g; s#\.stl"#.obj"#g;'
cd ..
cd ..
env/bin/xacro \
  pr2_common/pr2_description/robots/pr2.urdf.xacro > \
  pr2_common/pr2_description/robots/pr2.urdf

(2) Create OBJs using trimesh, and run ModelVisualizer:

import glob
from pydrake.all import *
import trimesh
stl_paths = glob.glob("pr2_common/pr2_description/meshes/**/*.stl")
print(f"Converting {len(stl_paths)} meshes...")
for stl_path in stl_paths:
    mesh = trimesh.load_mesh(stl_path)
    obj_text = trimesh.exchange.obj.export_obj(
        mesh,
        include_normals=False,
        include_color=False,
        include_texture=False,
        return_texture=False,
        write_texture=False,
    )
    obj_path = stl_path[:-3] + "obj"
    with open(obj_path, "w") as f:
        f.write(obj_text)
print("... done")
viz = ModelVisualizer()
viz.parser().package_map().AddPackageXml("pr2_common/pr2_description/package.xml")
viz.AddModels("pr2_common/pr2_description/robots/pr2.urdf")
viz.Run()

(To get it to load, I also need to change by hand some of the inertias to be physically valid. There were also a lot of warnings about unsupported transmission elements and such.)

RussTedrake commented 1 year ago

Thank you!

I would expect that any obj conversion should be lossless wrt the geometry, no? So I hadn't been worried about the quality of the obj conversion?

I think simplifying the mesh (via convex decomposition, or otherwise) is a separate issue. It might be important! but it's non-trivial.

Re: the inertias, I do think that's a real issue. I've opened #18773 to track it.

RussTedrake commented 1 year ago

Is your plan to basically have a python method (run once, not on every parse?) that will effectively update the asset file to be compliant?

jwnimmer-tri commented 1 year ago

I would expect that any obj conversion should be lossless wrt the geometry, no? So I hadn't been worried about the quality of the obj conversion?

The state of the art in open source Python mesh conversion tools for Wavefront data is pretty dire. I wouldn't necessarily assume they work well, or at all. I want to look at some simulated contacts to understand better what we have. It might be the case that sphere-izing the STL (rather than converting to OBJ) is a better approach.
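As a rough illustration of the "sphere-izing" idea, here is Ritter's approximate bounding sphere over a vertex cloud, stdlib-only. This is hypothetical; Drake has no such pipeline, and a serious replacement would more likely fit many small spheres rather than one.

```python
# Rough sketch of "sphere-izing" a mesh: Ritter's approximate bounding
# sphere over the vertex cloud. Illustrative only; not a Drake feature.
import math


def bounding_sphere(points):
    # Probe for two far-apart points to seed the sphere.
    p0 = points[0]
    p1 = max(points, key=lambda p: math.dist(p, p0))
    p2 = max(points, key=lambda p: math.dist(p, p1))
    center = [(a + b) / 2 for a, b in zip(p1, p2)]
    radius = math.dist(p1, p2) / 2
    # Grow the sphere to cover any stragglers.
    for p in points:
        d = math.dist(p, center)
        if d > radius:
            radius = (radius + d) / 2
            shift = (d - radius) / d
            center = [c + (q - c) * shift for c, q in zip(center, p)]
    return center, radius


# Unit cube corners: the minimal sphere has radius sqrt(3)/2 ~= 0.866.
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
center, radius = bounding_sphere(cube)
print(center, radius)
```

A single bounding sphere is of course a very coarse contact proxy; its appeal is only that it cannot inherit any of the source mesh's defects.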

For example, I don't recall if we care about polygon winding or normals. If I turned on normals output in trimesh, the result in meshcat looked terrible.

Also, meshlab crashes if I ask it to open the trimesh output above, with or without normals.

Is your plan to basically have a python method (run once, not on every parse?) that will effectively update the asset file to be compliant?

I wouldn't go so far as to say I have a plan yet. That idea is one fair near-term option, though.

Mostly I'm just trying to learn about the current state of affairs, and to see what's possible.

jwnimmer-tri commented 1 month ago

Folding STL and DAE into the same ticket is unhelpful for tracking and taking action.

We have https://github.com/RobotLocomotion/drake/issues/19109 filed for DAE support.

We have https://github.com/RobotLocomotion/drake/issues/19408 filed for STL support.

Closing this as a duplicate of those two.