ligengen / EgoGen

[CVPR 2024] Official code for EgoGen: An Egocentric Synthetic Data Generator
https://ego-gen.github.io
Apache License 2.0
70 stars · 4 forks

Issue running the Ego-perception driven motion synthesis on a generic indoor mesh #4

Closed · jedi644 closed this issue 1 month ago

jedi644 commented 1 month ago

Hey authors, this is really great and amazing work, and it's really cool to have the code released. I was trying to use the EgoGen code to generate ego-perception driven motion synthesis in a room. It works fine for the Replica room.

I was trying to run it on a different mesh, but it isn't very clear how to use the API or how to provide a generic indoor room mesh.

Can you please help with the following:

  1. Can you please explain how we can use the following script to run on a generic mesh:

    python -W ignore crowd_ppo/main_ppo.py --resume-path=data/checkpoint_87.pth --watch --deterministic-eval

  2. The API and inputs required for this script

It would be immensely helpful to get a little more insight into this.

ligengen commented 1 month ago

Hi @jedi644,

Thanks for your interest and question!

For a generic mesh, you need to:

  1. Make sure the mesh is z-up and the floor height is zero, as here (a minimal normalization sketch follows after the script in step 3).
  2. Calculate walkable regions with create_navmesh, as here. Change your new scene path here.
  3. Convert the navmesh to a shapely polygon, which is required for the ego-sensing calculation. You may use this script:
    
```python
import pickle

import matplotlib.pyplot as plt
import trimesh
from shapely import union_all
from shapely.geometry import Polygon
from shapely.plotting import plot_polygon

# Load the tight navmesh and keep only the xy coordinates (the navmesh is planar).
mesh = trimesh.load('/mnt/scratch/kaizhao/datasets/replica/room_0/navmesh_tight.ply', force='mesh')
verts = mesh.vertices[:, :-1]
faces = mesh.faces

# Turn each navmesh triangle into a shapely polygon and merge them into one walkable region.
polygons = []
for poly in faces:
    polygons.append(Polygon([verts[poly[0]], verts[poly[1]], verts[poly[2]], verts[poly[0]]]))
polygon = union_all(polygons)

with open('./shapely.pkl', 'wb') as f:
    pickle.dump(polygon, f)

# Sanity-check the merged region visually.
plt.clf()
plot_polygon(polygon)
plt.savefig('./shapely.jpg')
```
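For step 1 above, here is a minimal sketch of how a generic mesh could be normalized with trimesh. It assumes the input happens to be y-up (a common asset convention), and `my_room.ply` / `my_room_zup.ply` are placeholder paths; adapt the rotation to your mesh's actual up-axis:

```python
import numpy as np
import trimesh

mesh = trimesh.load('my_room.ply', force='mesh')  # placeholder path

# If the mesh is y-up, rotate 90 degrees about the x-axis to make it z-up.
# Skip this step if your mesh is already z-up.
rot = trimesh.transformations.rotation_matrix(np.pi / 2, [1, 0, 0])
mesh.apply_transform(rot)

# Translate so the floor (lowest z vertex) sits exactly at z = 0.
mesh.apply_translation([0.0, 0.0, -mesh.vertices[:, 2].min()])

mesh.export('my_room_zup.ply')  # placeholder path
```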

4. Sample [start and goal positions](https://github.com/ligengen/EgoGen/blob/81a955d4da06082b5b165d708dba811c89c13a8b/motion/exp_GAMMAPrimitive/utils/environments.py#L56). You can use `trimesh.sample.sample_surface_even(navmesh_tight, 1)[0]`; you can find how to use this function [here](https://github.com/ligengen/gamma_interaction/blob/354c610ec62a51029c9270e4a461e5710d5e26f5/test_navmesh.py#L635). Note: try to sample start and goal locations in the safe region (not too close to the boundary):
```python
from shapely import Point

# Rejection-sample until each point keeps a 0.22 m clearance from the navmesh boundary.
while True:
    start = trimesh.sample.sample_surface_even(navmesh_tight, 1)[0]
    if scene_shapely.contains(Point(start[0][:2]).buffer(0.22)):
        break

while True:
    finish = trimesh.sample.sample_surface_even(navmesh_tight, 1)[0]
    if scene_shapely.contains(Point(finish[0][:2]).buffer(0.22)):
        break
```

Also, make sure the two are neither too far apart nor too close (an actual walking distance of 2.8-3.5 m works well); a sketch of that check follows below. Our pretrained model is trained with max_depth=13; you can set it higher if you want to train the model to walk longer.
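Here is a minimal sketch of that distance check. It uses straight-line xy distance as a stand-in for actual walking distance (an assumption; around obstacles the walked path can be longer), and `sample_safe_point` is a hypothetical helper wrapping the rejection-sampling loop above:

```python
import numpy as np

def sample_start_goal(min_dist=2.8, max_dist=3.5):
    # Resample until start and goal are a reasonable distance apart.
    while True:
        start = sample_safe_point()   # hypothetical helper: one safe sample, shape (1, 3)
        finish = sample_safe_point()
        d = np.linalg.norm(start[0][:2] - finish[0][:2])  # straight-line xy distance
        if min_dist <= d <= max_dist:
            return start, finish
```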

With these, you should be good. You can check your implementation by setting visualize=True here.

As a side note, data/checkpoint_87.pth is trained only in room0, so it may perform poorly in other scenes. You could finetune it with more diverse scene meshes; for that you will need to calculate SDFs for the other scene meshes, which must be watertight (Replica scenes are not watertight).

Hope this helps:)

jedi644 commented 1 month ago

Hi @ligengen, thanks for the prompt reply, really appreciate it. I have a follow-up question regarding the SDF. I presume from your response above that the SDF is not required to run the code on another room; it's required only for training, i.e. if I were to finetune on other rooms? Please correct me if I am wrong.

ligengen commented 1 month ago

Hi @jedi644,

During evaluation, the SDF is required to calculate the penetration reward. See here.

If you do not care about accurate penetration detection in new scenes (like Replica) during your evaluation, it is fine without the SDF. You can simply comment out the lines related to the penetration reward to make the code runnable.

jedi644 commented 1 month ago

Hi @ligengen, thanks again for the response.

I was trying to estimate SDFs for meshes and pass them to the code, but was facing some issues. Can you please comment on the following:

  1. Which library did you use to generate SDFs for the Replica meshes? Is it https://github.com/wang-ps/mesh2sdf or https://github.com/marian42/mesh_to_sdf?
  2. Also, can you comment a bit on the data structure storing the SDF (i.e. the pickle file that is read by the code to compute the penetration reward) and the format in which the input is passed here (i.e. the `vertices_w` being passed in)?

Thanks again.

ligengen commented 1 month ago

Hi @jedi644, I used wang-ps's mesh2sdf. For your convenience, here is the script:

```python
import pickle

import mesh2sdf
import trimesh

filename = '/mnt/scratch/genli/datasets/replica/room0_watertight.ply'
size = 512  # resolution of the SDF grid (size^3 voxels)

mesh = trimesh.load(filename, force='mesh')

# Normalize the mesh into [-0.8, 0.8]^3; mesh2sdf computes the SDF on a grid over [-1, 1]^3.
vertices = mesh.vertices
bbmin = vertices.min(0)
bbmax = vertices.max(0)
center = (bbmin + bbmax) * 0.5
scale = 2.0 * 0.8 / (bbmax - bbmin).max()
vertices = (vertices - center) * scale

print('center: ', center)
print('scale: ', scale)
print(vertices.min(), vertices.max())

# Compute the SDF and store it together with the normalization parameters,
# which are needed later to map world coordinates into the grid.
sdf = mesh2sdf.compute(vertices, mesh.faces, size)
with open('room0_sdf.pkl', 'wb') as f:
    pickle.dump({'sdf': sdf, 'size': size, 'center': center, 'scale': scale}, f)
```

You may want to check whether the calculated SDF is clean by visualizing it (if the scene mesh is not watertight, the resulting SDF can be too noisy to use):

```python
import numpy as np
import skimage.measure
import trimesh

# np.load falls back to pickle.load for pickle files when allow_pickle=True.
voxels = np.load('room0_sdf.pkl', allow_pickle=True)

# Extract the zero level set of the SDF and inspect it as a mesh.
vertices, faces, normals, _ = skimage.measure.marching_cubes(voxels['sdf'], level=0)
mesh = trimesh.Trimesh(vertices=vertices, faces=faces, vertex_normals=normals)
mesh.show()
```

`vertices_w` is not the SDF; it is the world-frame positions of the human body vertices.
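To tie the two together, here is a minimal illustrative sketch (not EgoGen's actual implementation) of querying such a pickle at world-space body vertices. It assumes the stored `center`/`scale` follow the normalization in the script above and that the grid is indexed (x, y, z) over [-1, 1]^3; double-check both against your setup:

```python
import pickle

import numpy as np

with open('room0_sdf.pkl', 'rb') as f:
    data = pickle.load(f)
sdf, size = data['sdf'], data['size']
center, scale = data['center'], data['scale']

def query_sdf(vertices_w):
    """Nearest-voxel SDF lookup for (N, 3) world-space vertices (illustration only)."""
    v = (vertices_w - center) * scale              # same normalization as the script above
    idx = ((v + 1.0) * 0.5 * (size - 1)).round()   # map [-1, 1] to voxel indices
    idx = np.clip(idx.astype(int), 0, size - 1)
    return sdf[idx[:, 0], idx[:, 1], idx[:, 2]]

# Negative values mean a vertex is inside geometry, i.e. penetrating, so a
# simple penetration measure would be, e.g.:
# penetration = np.clip(-query_sdf(vertices_w), 0.0, None).sum()
```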

jedi644 commented 1 month ago

Thanks @ligengen for helping with my queries. I could run it on a few scenes, so I am closing the issue now. I will run some more experiments, and if I get stuck again I will reach out / reopen this issue. Thanks again.