Hi researchers,
Thanks for sharing this excellent work. I am a little curious about how the proposed method performs on looser types of clothes, such as long dresses or skirts?
Hi,
Currently our method does not support skirts and dresses. The main issue is the way we assign skinning weights, which introduces large discontinuities between the legs. I recently tried to learn the cloth dynamics of a dress, but these discontinuities introduce a big spike in the strain energy, and training gets stuck trying to minimize that term (while ignoring the other losses). This is what it looks like: [image]
We believe that the physics-based loss terms could also be used to learn deformations of garments like this, but to do so, first we would need to rethink the skinning step (Equation 1 from the paper).
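To make the issue concrete, here is a minimal sketch of that skinning step, assuming each garment vertex simply copies the skinning weights of its nearest body vertex; the names and shapes are illustrative, not the repository's actual API:

```python
import numpy as np

def skin(verts, joint_transforms, skin_weights):
    """Linear blend skinning: each vertex is deformed by a weighted
    blend of the per-joint transforms.

    verts: (V, 3), joint_transforms: (J, 4, 4), skin_weights: (V, J).
    """
    homo = np.concatenate([verts, np.ones((len(verts), 1))], axis=1)    # (V, 4)
    blended = np.einsum('vj,jab->vab', skin_weights, joint_transforms)  # (V, 4, 4)
    return np.einsum('vab,vb->va', blended, homo)[:, :3]
```

For a skirt, two adjacent garment vertices between the legs can inherit weights from opposite legs, so their blended transforms diverge as soon as the legs move apart; this is the discontinuity the strain-energy term then tries (and fails) to smooth out.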
Hope this helps!
Hi,
Got it. Thanks for your quick reply. It helps a lot and has indeed inspired our next research direction!
Hi! I have noticed that the results for the dress are much better in your repo https://github.com/isantesteban/vto-garment-collisions. Is this because of the diffused human model? Perhaps the diffused model addresses the skinning problem of the dress better. @isantesteban
Yes, our results using the approach described in Self-Supervised Collision Handling via Generative 3D Garment Models for Virtual Try-On (CVPR 2021) are much better for highly nonrigid garments. This is because in that CVPR 2021 paper we pay special attention to the rigging weights associated with the garment (we use the diffused model to update the rigging weights, etc.). For SNUG, we just keep the rigging weights constant, which is not ideal, but the focus of the work was different.
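For intuition, one simple way to soften the nearest-vertex weight transfer (a sketch only, not the exact formulation of the CVPR 2021 paper; `k`, `sigma`, and all names here are illustrative) is to blend the skinning weights of several nearby body vertices:

```python
import numpy as np
from scipy.spatial import cKDTree

def blended_skin_weights(body_verts, body_weights, cloth_verts, k=8, sigma=0.05):
    """Blend the skinning weights of the k nearest body vertices with a
    Gaussian falloff instead of copying from a single nearest vertex.

    body_verts: (B, 3), body_weights: (B, J), cloth_verts: (C, 3).
    """
    dists, idx = cKDTree(body_verts).query(cloth_verts, k=k)  # both (C, k)
    w = np.exp(-(dists / sigma) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    return np.einsum('ck,ckj->cj', w, body_weights[idx])      # (C, J)
```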
@dancasas Got it, thanks for such a quick response!
@isantesteban Hi~
How did you train the diffused human model in your paper Self-Supervised Collision Handling via Generative 3D Garment Models for Virtual Try-On (CVPR 2021)? I couldn't find the training details of that part in the paper.
I think the diffused model could be very useful (e.g., as a standalone plug-in) for other virtual try-on pipelines.
Sorry for talking about the other repo here, but it seems that you have closed the issues section in that repo.
You can find the details of how we trained the diffused model in the supplementary document (linked in the Files section of the project website you are referring to).
@dancasas @isantesteban thanks! I will try to implement that : )
By the way, I have some concerns about Table 3 of the supplementary.
If you are using the same train/val split as your work Learning-Based Animation of Clothing for Virtual Try-On, shouldn't there be 4 test sequences? (As pointed out here: https://github.com/isantesteban/snug/issues/5.)
Furthermore, I have evaluated the performance of Learning-Based Animation of Clothing for Virtual Try-On on my PC: the average error (per-vertex Euclidean distance) over 68 sequences (68 = 17 × 4) is 1.21 cm, which is much lower than the reported 2.9 cm. I think 1.21 cm is also more consistent with Fig. 7 of Learning-Based Animation of Clothing for Virtual Try-On.
I attached some figures below for reference, and a sketch of my error computation follows.
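This is how I compute that number (a minimal sketch; `mean_vertex_error` and the `(frames, vertices, 3)` array layout are my own conventions, not from the paper's code):

```python
import numpy as np

def mean_vertex_error(pred, gt):
    """Mean per-vertex Euclidean distance over a sequence.

    pred, gt: (num_frames, num_vertices, 3) arrays, assumed to be in cm.
    """
    return np.linalg.norm(pred - gt, axis=-1).mean()

# Average over the 68 test sequences (17 body shapes x 4 motions):
# avg_error = np.mean([mean_vertex_error(p, g) for p, g in zip(preds, gts)])
```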
Regarding the diffused human model: I have tried to sample 100 points along the segment from the body surface to each cloth vertex (rest pose, dress). The sampled points are then used to query nearest neighbours on the body, following Eq. 4-6 in http://mslab.es/projects/SelfSupervisedGarmentCollisions/. But it seems that all 100 samples have exactly the same nearest neighbour for each cloth vertex, in which case the result is the same as naive LBS.
```python
#!/usr/bin/env python
# coding=utf-8
import numpy as np
import open3d as o3d
import pickle as pkl

from utils import load_motion, create_o3dmesh, load_obj, query_closest_vertices, laplacianMatrix
from smpl import SMPLModel
from skin import lbs
# Paths to the garment mesh, the motion sequence and the SMPL model
cloth_path = "../meshes/dress.obj"
motion_path = "../motions/dance1.npz"
model_path = "../smpl_models/SMPL_FEMALE.pkl"

smpl_model = SMPLModel(model_path)

# Load the raw SMPL data to access the template, skinning weights and blendshapes
dat = pkl.load(open(model_path, "rb"), encoding='latin1')
body_template = np.asarray(dat['v_template'])
body_skinweights = np.asarray(dat['weights'])   # (num_body_verts, num_joints)
body_shapedirs = np.asarray(dat['shapedirs'])   # (num_body_verts, 3, 10)
body_posedirs = np.asarray(dat['posedirs'])     # (num_body_verts, 3, 207)
body_vertices, body_faces = smpl_model.v_template, smpl_model.faces

motion = load_motion(motion_path, swap_axis=True)
poses, shape, translations = motion['pose'], motion['shape'], motion['translation']

cloth_vertices, cloth_faces = load_obj(cloth_path)
cloth_lapmat = laplacianMatrix(cloth_faces)

# cloth = create_o3dmesh(cloth_vertices, cloth_faces); body = create_o3dmesh(body_vertices, body_faces)
# o3d.visualization.draw_geometries([cloth, body])

# Naive transfer: each cloth vertex copies the skinning weights and
# blendshapes of its closest body vertex
closest_indices = query_closest_vertices(body_vertices, cloth_vertices)
cloth_skinweights = body_skinweights[closest_indices]
cloth_shapedirs = body_shapedirs[closest_indices]
cloth_posedirs = body_posedirs[closest_indices]
# Obtain the "dynamic" nearest neighbours used for the diffused weights
num_samples = 500

def get_dyna_nn(bv, cv, num_samples=100, uniform=True):
    # First query the nearest body vertex for each cloth vertex
    nn = query_closest_vertices(bv, cv)
    body_cloth = cv - bv[nn]
    # Sample points along the segment from the nearest body vertex to the cloth vertex
    xs = np.linspace(0., 1., num_samples).reshape(1, -1, 1)
    sampled_points = bv[nn, np.newaxis] + xs * body_cloth[:, np.newaxis]
    # Then query the nearest body vertex of every sampled point
    dyna_nn = query_closest_vertices(bv, sampled_points.reshape(-1, 3)).reshape(-1, num_samples)
    if uniform:
        dyna_nn_weights = np.ones_like(dyna_nn) / num_samples
    else:
        dyna_nn_weights = None
    # Debug: print cloth vertices whose samples hit more than one body vertex
    for _dyna_nn in dyna_nn:
        if np.unique(_dyna_nn, axis=0).shape[0] > 1:
            print(_dyna_nn)
    print(dyna_nn)
    return dyna_nn, dyna_nn_weights
dyna_nn, dyna_nn_weights = get_dyna_nn(body_vertices, cloth_vertices, num_samples=num_samples)

# Animate body and cloth together and visualize them with Open3D
viser = o3d.visualization.Visualizer()
viser.create_window()
mesh = o3d.geometry.TriangleMesh()

for frame_id, (pose, trans) in enumerate(zip(poses, translations)):
    smpl_model.set_params(pose=pose, beta=shape, trans=trans)
    smpl_model.update()

    # Apply the copied shape/pose blendshapes to the cloth, then skin both
    # meshes with linear blend skinning
    cverts = (cloth_vertices
              + np.einsum('a,bca->bc', shape, cloth_shapedirs)
              + smpl_model.pose_blendshape[closest_indices])
    bverts = lbs(smpl_model.v_posed, smpl_model.global_joint_transforms, smpl_model.weights) + trans
    cverts = lbs(cverts, smpl_model.global_joint_transforms, cloth_skinweights) + trans

    # Merge body and cloth into a single mesh for rendering
    verts = np.concatenate((bverts, cverts), axis=0)
    faces = np.concatenate((body_faces, cloth_faces + bverts.shape[0]), axis=0)

    mesh.vertices = o3d.utility.Vector3dVector(verts)
    mesh.triangles = o3d.utility.Vector3iVector(faces)
    if frame_id == 0:
        # Blue cloth, grey body
        cloth_colors = np.zeros_like(cverts); cloth_colors[:, 2] = 1.
        body_colors = np.ones_like(bverts) / 3
        mesh.vertex_colors = o3d.utility.Vector3dVector(np.concatenate((body_colors, cloth_colors), axis=0))
        viser.add_geometry(mesh)
    else:
        viser.update_geometry(mesh)
    viser.poll_events()
```