zhan-xu / RigNet

Code for SIGGRAPH 2020 paper "RigNet: Neural Rigging for Articulated Characters"
GNU General Public License v3.0

Train RigNet without skin prediction #44

Closed Tigran1983 closed 2 years ago

Tigran1983 commented 3 years ago

Hello dear Zhan-Xu again,

This time I am trying another approach:

I want to train your network without skinning prediction. For this I used your script to extract rig_info and obj files from the original fbx, with the skinning parts of the script commented out, so I got the obj and rig_info files. I also couldn't find an automated, command-line way to remesh my models, so I used the original models instead of remeshed ones. Then I ran the pretrain_attn script, which went fine, but I cannot run compute_volumetric_geodesic.py: it stalls on the first model id and its RAM usage climbs to 32 GB while producing nothing.

Please let me know what you think about this approach, and whether you see anything that could help with either issue (automated remeshing of the models, and compute_volumetric_geodesic.py not working)?

If I get this working, I'll share the results.

Thanks and Best Regards, Tigran

zhan-xu commented 3 years ago

compute_volumetric_geodesic.py will cost too much memory and time if your mesh has many faces... In quick_start.py, lines 229-236, there is a down-sampling/mesh-simplification step; that can save a lot of time and memory. Maybe you can also use simplify_quadric_decimation from Open3D (http://www.open3d.org/docs/0.7.0/python_api/open3d.geometry.simplify_quadric_decimation.html) to remesh your models?
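The suggestion above might be sketched roughly as below. This is a minimal sketch under assumptions, not code from the repo: it assumes a recent Open3D release where simplify_quadric_decimation is a method on TriangleMesh (in 0.7 it was a free function under open3d.geometry), and the vertex-to-triangle budget is a heuristic from Euler's formula for closed manifold meshes (F ≈ 2V). The function names target_triangle_count and simplify_mesh are hypothetical helpers, not RigNet code.

```python
def target_triangle_count(target_vertices):
    """Heuristic triangle budget for a desired vertex count.

    For a closed manifold mesh, Euler's formula gives F ~= 2V, so
    decimating to ~2 * target_vertices triangles lands near the
    desired vertex count.
    """
    return 2 * target_vertices


def simplify_mesh(in_path, out_path, target_vertices=3000):
    # Deferred import so the helper above is usable without Open3D.
    import open3d as o3d

    mesh = o3d.io.read_triangle_mesh(in_path)
    simplified = mesh.simplify_quadric_decimation(
        target_number_of_triangles=target_triangle_count(target_vertices))
    # NOTE: out_path must end in an extension Open3D recognizes
    # (e.g. ".obj" or ".ply"); otherwise write_triangle_mesh fails
    # with "[Open3D WARNING] ... unknown file extension".
    if not o3d.io.write_triangle_mesh(out_path, simplified):
        raise IOError("failed to write %s" % out_path)
```

Usage would be e.g. `simplify_mesh("char.obj", "char_remesh.obj", target_vertices=3000)` before running the geodesic preprocessing.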

Tigran1983 commented 3 years ago

Thanks for your response. I tried Open3D's simplify_quadric_decimation but didn't get a result; it fails with "[Open3D WARNING] Write geometry::TriangleMesh failed: unknown file extension". No worries, though: my colleagues will remesh the models for me, and then I'll try the data preprocessing and training with models without skin info and let you know about the results. If you have thoughts on any changes needed for data without skin info, please let me know.

Thanks and best regards, Tigran

Tigran1983 commented 3 years ago

Hello dear Zhan-Xu,

Now I have the new dataset, with the original obj and rig_info files plus the remeshed obj and rig_info files. I ran compute_pretrain_attn.py and it passed, but I can't run compute_volumetric_geodesic.py. I am running on Google Cloud with a V100, but the GPU isn't used by this script, and the script also crashes on the first model. My remeshed characters have about 5000 vertices, as you suggested. Please tell me, how were you able to run compute_volumetric_geodesic.py?

Best Regards, Tigran

Tigran1983 commented 3 years ago

347

    Traceback (most recent call last):
      File "geometric_proc/compute_volumetric_geodesic.py", line 173, in <module>
        one_process(dataset_folder, start_id, end_id)
      File "geometric_proc/compute_volumetric_geodesic.py", line 127, in one_process
        pts_bone_visibility = calc_pts2bone_visible_mat(mesh_remesh, origins, ends)
      File "geometric_proc/compute_volumetric_geodesic.py", line 64, in calc_pts2bone_visible_mat
        locations, index_ray, index_tri = RayMeshIntersector.intersects_location(origins, ray_dir + 1e-15)
      File "/opt/conda/lib/python3.7/site-packages/trimesh/ray/ray_triangle.py", line 107, in intersects_location
        **kwargs)
      File "/opt/conda/lib/python3.7/site-packages/trimesh/ray/ray_triangle.py", line 66, in intersects_id
        triangles_normal=self.mesh.face_normals)
      File "/opt/conda/lib/python3.7/site-packages/trimesh/ray/ray_triangle.py", line 244, in ray_triangle_id
        triangle_candidates = triangles[ray_candidates]
    numpy.core._exceptions.MemoryError: Unable to allocate 54.9 GiB for an array with shape (818403510, 3, 3) and data type float64
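The MemoryError above comes from trimesh's pure-Python ray backend, which materializes a candidate-triangle array for every queried ray at once; its size grows with rays × triangles. One common workaround (an assumption on my part, not something the RigNet scripts do) is to submit the rays in batches, capping peak memory at the cost of speed. A generic sketch, where intersect_fn stands for any trimesh-style query such as `mesh.ray.intersects_location` and the wrapper name is hypothetical:

```python
import numpy as np


def intersects_location_batched(intersect_fn, origins, directions, batch_size=512):
    """Run a trimesh-style intersects_location query in batches.

    intersect_fn(origins, directions) -> (locations, index_ray, index_tri).
    Batching bounds the candidate-triangle array the pure-Python ray
    backend allocates per query, trading some speed for memory.
    """
    all_loc, all_ray, all_tri = [], [], []
    for start in range(0, len(origins), batch_size):
        loc, idx_ray, idx_tri = intersect_fn(
            origins[start:start + batch_size],
            directions[start:start + batch_size])
        all_loc.append(np.asarray(loc).reshape(-1, 3))
        # Ray indices are relative to the batch; shift them back to
        # the global ray numbering.
        all_ray.append(np.asarray(idx_ray, dtype=int) + start)
        all_tri.append(np.asarray(idx_tri, dtype=int))
    if not all_loc:
        return np.empty((0, 3)), np.empty(0, dtype=int), np.empty(0, dtype=int)
    return (np.concatenate(all_loc),
            np.concatenate(all_ray),
            np.concatenate(all_tri))
```

Alternatively, installing pyembree lets trimesh switch to its much faster and leaner Embree-backed intersector, which avoids this allocation pattern entirely.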

Tigran1983 commented 3 years ago

Dear Zhan-Xu,

Is it possible to successfully run volumetric_geodesic with remeshed models of 5000 vertices, when remeshed models with 6000 vertices already need 55 GB of memory? I thought that on a Google Cloud instance with 8 cores I could use remeshed models with 6000 vertices. Am I wrong?

zhan-xu commented 3 years ago

Hi, sorry for the late response. The geometric_proc/compute_volumetric_geodesic.py script doesn't have a down-sampling step, so it can cost a lot of memory.

  1. If you don't predict skinning weights, maybe the volumetric geodesic distance is not necessary? It is only used to sort bones for each vertex.
  2. With Open3D's simplify_quadric_decimation, you said the error is "unknown file extension". Could you post how you called the function? I think down-sampling is very important to make the script runnable. (I used a cluster from the university, but even there it cost too much memory, so I think down-sampling is necessary in practice; the original code is just for reproducing the results in the paper.)

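Regarding the "unknown file extension" warning in point 2: Open3D's write_triangle_mesh picks its writer from the output path's extension and returns False (with that warning) when the extension isn't recognized, so the usual cause is an output path with no extension or an unsupported one. A small guard sketch under that assumption; checked_output_path is a hypothetical helper, and the extension set reflects formats Open3D's mesh writer commonly documents, not an exhaustive list:

```python
import os

# Extensions Open3D's mesh writer commonly documents support for
# (assumption; check your Open3D version's io docs).
SUPPORTED_MESH_EXTS = {".ply", ".obj", ".off", ".stl", ".gltf"}


def checked_output_path(path):
    """Fail early with a clear message instead of letting Open3D warn
    'Write geometry::TriangleMesh failed: unknown file extension' and
    silently return False."""
    ext = os.path.splitext(path)[1].lower()
    if ext not in SUPPORTED_MESH_EXTS:
        raise ValueError(
            "unsupported mesh extension %r in %r; use one of %s"
            % (ext, path, sorted(SUPPORTED_MESH_EXTS)))
    return path
```

For example, `checked_output_path("char_remesh")` raises immediately, while `checked_output_path("char_remesh.obj")` passes the path through to write_triangle_mesh.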
Tigran1983 commented 3 years ago

Hi dear Zhan-Xu,

I have been working hard over the last month, but couldn't carry my approach through to the end. I have an offer for you: I have about 400 fully rigged models with face, hands, and legs. I can provide them to you to use as your own dataset and adapt your code to it, because I am not familiar with 3D modeling and it takes a very big effort for me to understand each issue during data preprocessing and generation. If you want, I'll send you this dataset; you, from your side, would run the preprocessing with your scripts and send me the final stage, after which I can start training your networks. Also, please note that my approach is as follows: I want to train the networks to predict a rig without skinning, and after rig prediction we'll use Blender's automatic skinning option. With my data, I think your networks could predict face, hand, and leg rigs. If you agree and can spend time on this, please send me your e-mail address so I can send you my dataset.

Best Regards, Tigran

zhan-xu commented 3 years ago

What is your email address? We can discuss over emails for your questions.

Tigran1983 commented 3 years ago

My e-mail is: tigran_grigoryan83@yahoo.com I am from Armenia.