liren2515 / DrapeNet

Code for "DrapeNet: Garment Generation and Self-Supervised Draping", CVPR2023
GNU General Public License v3.0

Inference on training samples #20

Open Shanthika opened 2 months ago

Shanthika commented 2 months ago

I'm trying to run inference on samples from the training set. While some have perfect retargeting, the draping seems to explode for other samples. The results are attached below (1903, 901, 002). I'm using the same preprocessing and inference code for all samples. Is this a limitation of the method, or could there be a bug in my code?

liren2515 commented 2 months ago

For the second one, I guess the problem is that the arms are too close to the body. In our training, we actually separate the arms a bit away from the body using the following code:

from scipy.spatial.transform import Rotation as R

def separate_arms(poses, angle=20, left_arm=17, right_arm=16):
    # poses: (batch, num_joints * 3) flattened SMPL axis-angle parameters.
    num_joints = poses.shape[-1] // 3
    poses = poses.reshape((-1, num_joints, 3))

    # Rotate the shoulder joints about the z-axis to push the arms away from the torso.
    rot = R.from_euler('z', -angle, degrees=True)
    poses[:, left_arm] = (rot * R.from_rotvec(poses[:, left_arm])).as_rotvec()
    rot = R.from_euler('z', angle, degrees=True)
    poses[:, right_arm] = (rot * R.from_rotvec(poses[:, right_arm])).as_rotvec()

    # Damp the hand joints (22, 23) so the hands stay close to neutral.
    poses[:, 23] *= 0.1
    poses[:, 22] *= 0.1

    return poses.reshape((poses.shape[0], -1))

Since the network never saw a pose like the one in your second figure during training, it may give you results like that.
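
For reference, a minimal usage sketch of separate_arms, assuming the pose is stored as a flattened (1, 72) SMPL axis-angle vector (the file name below is a placeholder):

import numpy as np

pose = np.load('pose.npy').reshape(1, -1)  # placeholder file; shape (1, 72)
pose = separate_arms(pose)                 # push the arms away from the torso
# ... then feed `pose` to the draping network as usual.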

Shanthika commented 2 months ago

I'm facing the same issue with bottom garments. Do I separate the legs, too? Is there a code snippet available for the same?

Shanthika commented 2 months ago

I'm evaluating DrapeNet on ground-truth CLOTH3D samples by randomly selecting garments and SMPL sequences. Should I include this rotation for evaluation? Also, should I apply these rotations to all samples, or how do I decide whether the arms are too close to the body?

liren2515 commented 2 months ago

> I'm facing the same issue with bottom garments. Do I separate the legs, too? Is there a code snippet available for the same?

We don't do that for the legs. Can you show me an image of the pose?

> I'm evaluating DrapeNet on ground-truth CLOTH3D samples by randomly selecting garments and SMPL sequences. Should I include this rotation for evaluation? Also, should I apply these rotations to all samples, or how do I decide whether the arms are too close to the body?

In our training, we apply this separation to every pose sample. So you can apply it to all your SMPL sequences.
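
As a rough sketch (the file name and the (num_frames, 72) layout below are assumptions): separate_arms already handles batched input, so the whole sequence can be passed at once.

import numpy as np

pose_seq = np.load('smpl_sequence.npy')  # placeholder; shape (num_frames, 72)
pose_seq = separate_arms(pose_seq)       # applies the separation to every frame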

Shanthika commented 2 months ago

These are some inference samples on bottom garments (images attached: 5202, 9701, 60700).

liren2515 commented 2 months ago

I assume you are using a pose sequence from CLOTH3D, right? Could you compare the body of your pose with the body of the pose in pose-sample.pt? I remember the up-axis is different between the CLOTH3D and CMU data.
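
In case it helps, a hedged sketch of fixing the up-axis by rotating the global orientation (joint 0) of the flattened poses; the helper name, axis, and sign below are assumptions that should be checked against pose-sample.pt:

import numpy as np
from scipy.spatial.transform import Rotation as R

def rotate_up_axis(poses, angle=-90):
    # Hypothetical helper: compose the root (global) orientation of flattened
    # (num_frames, 72) SMPL poses with a rotation about the x-axis, e.g. to
    # convert between Y-up and Z-up conventions. The axis and sign here are
    # assumptions; verify them against pose-sample.pt before using.
    poses = np.array(poses, dtype=np.float32)
    fix = R.from_euler('x', angle, degrees=True)
    poses[:, :3] = (fix * R.from_rotvec(poses[:, :3])).as_rotvec()
    return poses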

Shanthika commented 2 months ago

Hello, I was not passing the top-layer information while draping the bottom garment earlier. I'm getting the expected draping results now. Thank you for addressing the issues!