Closed: marcilzakour closed this issue 10 months ago
Hi Marsil,
This should already be possible, albeit a bit hacky. The `Meshes` class accepts `uv_coords` and `path_to_texture` arguments. `path_to_texture` loads the texture from a file, which might not be what you want, but the initializer stores the loaded image as `self.texture_image`, which you can override with the texture that you want to display. I.e. something like this:
```python
mano_vertices, mano_faces = mano_model(...)
mano_meshes = Meshes(mano_vertices, mano_faces, uv_coords=html_uv_map)
mano_meshes.texture_image = ...  # your texture
mano_meshes.show_texture = True  # make sure the texture is actually drawn
```
Would that solve your problem? We could also look into passing the texture directly into the `Meshes` class instead of just a path to it.
Hi Manuel,
This way I can create a mesh with a texture. For example, I tried the following:
```python
mano_meshes = Meshes(
    vertices=mano_vertices,
    faces=mano_faces,
)
mano_meshes.uv_coords = tex_uvs
mano_meshes.texture_image = tex_image
mano_meshes.has_texture = True
mano_meshes.use_pickle_texture = True
mano_meshes.show_texture = True
viewer.scene.add(mano_meshes)
```
With this approach, I can add the texture and render it. However, the mapping looks wrong:

I assume this is because the `uv_coords` are per-vertex, and using the `mano_faces` to sample the faces' textures results in a wrong mapping; it should instead use the `tex_faces` together with the `tex_uvs`.

It looks like the per-face UV coordinates are not computed when the mesh is created or added to the scene; even before rendering, the texture faces are just the mesh faces.
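As an aside, the mismatch is visible from the array shapes alone. Here is a minimal sketch with placeholder data (the shapes are taken from this thread; the values are random stand-ins, not real MANO/HTML data):

```python
import numpy as np

# Shapes from the thread: MANO has 778 mesh vertices, while the HTML UV
# layout has 907 UV vertices, because texture seams duplicate UV entries.
num_mesh_verts = 778
num_uv_verts = 907
faces = np.random.randint(0, num_mesh_verts, size=(1538, 3))      # mesh index buffer
tex_faces = np.random.randint(0, num_uv_verts, size=(1538, 3))    # UV index buffer
tex_uvs = np.random.rand(num_uv_verts, 2)                         # per-UV-vertex coords

# A per-vertex lookup (indexing tex_uvs by the mesh faces) is not even
# well-defined here: tex_uvs has 907 rows but mesh indices go up to 777.
# The correct per-face UVs come from the separate UV index buffer:
per_face_uvs = tex_uvs[tex_faces]  # shape (1538, 3, 2)
```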
Hi Marsil,
I see, we'll look into it; it shouldn't be a big problem. Do you happen to have a code snippet showing how you currently do this with pytorch3d or open3d? Though I guess it's simply a matter of using a different set of texture faces, as you described above.
Best Manuel
Hi Manuel,
Yes, exactly: the UV coordinates for the mesh faces are indexed independently from the mesh vertices. Here is a short snippet:
```python
import open3d as o3d
import torch
from pytorch3d.io import IO
from pytorch3d.renderer import TexturesUV
from pytorch3d.structures import Meshes

vertices = ...        # numpy float (778, 3)
faces = ...           # numpy int64 (1538, 3)
vertex_uvs = ...      # numpy float (907, 2)
faces_uv_index = ...  # numpy int (1538, 3)
faces_uvs = vertex_uvs[faces_uv_index]  # numpy float (1538, 3, 2)
texture_map = ...     # numpy float (1024, 1024, 3)
num_faces = faces.shape[0]

# PyTorch3D
# The texture map should be flipped upside down for PyTorch3D.
pytorch3d_texture = TexturesUV(
    maps=[torch.Tensor(texture_map[::-1, :, :].copy())],
    faces_uvs=[torch.LongTensor(faces_uv_index)],
    verts_uvs=[torch.Tensor(vertex_uvs)],
)
pytorch3d_mesh = Meshes(
    verts=[torch.Tensor(vertices)],
    faces=[torch.LongTensor(faces)],
    textures=pytorch3d_texture,
)
# visualization
IO().save_mesh(pytorch3d_mesh, path="output/hand_mesh_pytorch3d.obj", include_textures=True)

# Open3D
open3d_mesh = o3d.geometry.TriangleMesh()
open3d_mesh.vertices = o3d.utility.Vector3dVector(vertices)
open3d_mesh.triangles = o3d.utility.Vector3iVector(faces)
open3d_mesh.triangle_uvs = o3d.utility.Vector2dVector(faces_uvs.reshape(-1, 2))
open3d_mesh.triangle_material_ids = o3d.utility.IntVector([0] * num_faces)
open3d_mesh.textures = [o3d.geometry.Image(texture_map)]
# visualization
open3d_mesh.compute_vertex_normals()
open3d_mesh.compute_triangle_normals()
o3d.visualization.draw_geometries([open3d_mesh])
```
Result for pytorch3d: <img src="https://github.com/eth-ait/aitviewer/assets/13512271/e0399db9-5610-4ed1-9474-645eabf0ed2d" width="200" height="200"> Result for open3d:
Kind Regards, Marsil.
Hi, sorry for the delay.
We added a new utility function that computes a set of vertices sharing the same indices for all vertex attributes (e.g. positions and UVs), so that the result can be used to create a `Meshes` object with the current implementation.

We discussed adding an option to have a different set of indices for positions and UVs but decided against it for now. The main reason is that for rendering with OpenGL we still need to internally convert to a unique set of indices, and we prefer that conversion to happen outside the `Meshes` class to keep its implementation simpler.
If you want to test this new function now, you can clone the repository, switch to the `dev` branch, install the package with `pip install -e .`, and do something like this:

```python
from aitviewer.utils import expand_vertex_attributes

new_faces, (new_vertices, new_vertex_uvs) = expand_vertex_attributes((vertices, vertex_uvs), (faces, faces_uv_index))
meshes = Meshes(new_vertices, new_faces, uv_coords=new_vertex_uvs)
```

For a quick test you can also copy the function directly; you can find its body here: https://github.com/eth-ait/aitviewer/blob/b6e902ac9d4f9304671898760feac0b9aeab1809/aitviewer/utils/utils.py#L358-L386. For this to work you also need to add the import `from trimesh.visual.texture import unmerge_faces`.
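For readers without the repository at hand, the underlying re-indexing idea can be sketched in pure numpy. This is my own illustration of the technique, not aitviewer's actual implementation (the function name `unify_indices` and its exact signature are invented here): build one output vertex per unique (position index, UV index) pair, so a single index buffer addresses both attributes.

```python
import numpy as np

def unify_indices(vertices, vertex_uvs, faces, faces_uv_index):
    """Merge separate position/UV index buffers into one shared index buffer."""
    # Pair up the two indices at every face corner: shape (F*3, 2).
    pairs = np.stack([faces.reshape(-1), faces_uv_index.reshape(-1)], axis=1)
    # Each unique pair becomes one new vertex; `inverse` is the new index buffer.
    unique_pairs, inverse = np.unique(pairs, axis=0, return_inverse=True)
    new_faces = inverse.reshape(-1, 3)
    new_vertices = vertices[unique_pairs[:, 0]]   # positions, duplicated at seams
    new_uvs = vertex_uvs[unique_pairs[:, 1]]      # UVs, now aligned per new vertex
    return new_faces, new_vertices, new_uvs
```

Vertices that sit on a texture seam (same position index, different UV indices) get duplicated, which is exactly why the unified vertex count (907 for HTML) can exceed the original mesh vertex count (778 for MANO).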
We haven't tested this with the HTML dataset yet, so if you find any issues with this let us know!
Hi,
thank you for the quick update. I just tried it, and it works fine with HTML as well!
I think the issue could be closed.
Kind Regards, Marsil.
Hi,
Thanks for your work! I am using MANO with HTML to generate a hand texture map. The HTML UV map does not match MANO's default per-vertex mapping: it assumes 907 UV vertices. This kind of per-face UV indexing is supported in frameworks like PyTorch3D and Open3D, but it looks like Pyrender does not support it. Is there a way to render the hand texture in aitviewer?
Kind Regards, Marsil.