soubhiksanyal / RingNet

Learning to Regress 3D Face Shape and Expression from an Image without 3D Supervision
https://ringnet.is.tue.mpg.de
MIT License

Remove the global rotation of the generated model #40

Open · qianyunw opened this issue 4 years ago

qianyunw commented 4 years ago

Hi! Thank you so much for your amazing work! I originally wanted to use the code below to remove the global rotation from the generated model, but because of the limited expressiveness of the parameters, the result differs from the original model. How can I instead remove the rotation directly on the vertex coordinates of the original model?

params = np.load(out_param, allow_pickle=True)
params = params[()]
#pose = np.hstack((params['pose'], np.zeros(15-params['pose'].shape[0])))
pose = np.zeros(15)

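# keep only the jaw rotation (FLAME pose[6:9]); the global rotation slot (pose[0:3]) stays zero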
pose[6] = params['pose'][3]
pose[7] = params['pose'][4]
pose[8] = params['pose'][5]

expression = np.hstack((params['expression'], np.zeros(100-params['expression'].shape[0])))
shape = np.hstack((params['shape'], np.zeros(300-params['shape'].shape[0])))
flame_genral_model = load_model(config.flame_model_path)
generated_neutral_mesh = verts_decorated(ch.array([0.0,0.0,0.0]),
                                    ch.array(pose),
                                    ch.array(flame_genral_model.r),
                                    flame_genral_model.J_regressor,
                                    ch.array(flame_genral_model.weights),
                                    flame_genral_model.kintree_table,
                                    flame_genral_model.bs_style,
                                    flame_genral_model.f,
                                    bs_type=flame_genral_model.bs_type,
                                    posedirs=ch.array(flame_genral_model.posedirs),
                                    betas=ch.array(np.hstack((shape, expression))),
                                    shapedirs=ch.array(flame_genral_model.shapedirs),
                                    want_Jtr=True)
neutral_mesh = Mesh(v=generated_neutral_mesh.r, f=generated_neutral_mesh.f)
neutral_mesh.write_obj(out_mesh)
print("Saved neutral mesh file to " + out_mesh)


qianyunw commented 4 years ago

I also tried the following, but it didn't work:

params = np.load(out_param, allow_pickle=True)
params = params[()]
#pose = np.hstack((params['pose'], np.zeros(15-params['pose'].shape[0])))

pose = np.zeros(15)
pose[0] = params['pose'][0]
pose[1] = params['pose'][1]
pose[2] = params['pose'][2]

print(pose)
flame_genral_model = load_model(config.flame_model_path)
bs_type=flame_genral_model.bs_type
posedirs=ch.array(flame_genral_model.posedirs)
v_posed = vertices[0] - posedirs.dot(posemap(bs_type)(pose))

neutral_mesh = Mesh(v=v_posed, f=template_mesh.f)
neutral_mesh.write_obj(out_mesh)
print("Saved neutral mesh file to " + out_mesh)

TimoBolkart commented 3 years ago

What exactly do you want to do? Just output the RingNet prediction without the global rotation?

HaoKun-Li commented 3 years ago

It seems that some parameters in "pose" affect the opening of the jaw and the rotation of the whole mesh. If we want to remove the global rotation but keep the shape of the jaw, we should keep the jaw-rotation entries of the "pose" parameter and set the other pose parameters to zero.
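
For reference, my reading of the 15-dimensional FLAME pose vector (not stated in this thread) and how the 6 pose values predicted by RingNet map into it, as done in the code below:

# assumed FLAME pose layout: global rotation plus 4 joints, 3 axis-angle values each
#   pose[0:3]   global (root) rotation
#   pose[3:6]   neck rotation
#   pose[6:9]   jaw rotation
#   pose[9:15]  left / right eyeball rotations
# RingNet predicts 6 values: global rotation (params['pose'][:3]) and jaw (params['pose'][3:6]),
# which is why the jaw values are copied into pose[6:9] while everything else stays zero.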

Here is my code:

def make_prdicted_mesh_without_srt(params, flame_model_path):
    pose = np.zeros(15)
    # whole mesh rotation
    # pose[:3] = params['pose'][0:3]
    # jaw rotation
    pose[6:9] = params['pose'][3:]
    expression = np.hstack((params['expression'], np.zeros(100-params['expression'].shape[0])))
    shape = np.hstack((params['shape'], np.zeros(300-params['shape'].shape[0])))
    flame_genral_model = load_model(flame_model_path)
    generated_neutral_mesh = verts_decorated(ch.array([0.0,0.0,0.0]),
                        ch.array(pose),
                        ch.array(flame_genral_model.r),
                        flame_genral_model.J_regressor,
                        ch.array(flame_genral_model.weights),
                        flame_genral_model.kintree_table,
                        flame_genral_model.bs_style,
                        flame_genral_model.f,
                        bs_type=flame_genral_model.bs_type,
                        posedirs=ch.array(flame_genral_model.posedirs),
                        betas=ch.array(np.hstack((shape, expression))),
                        shapedirs=ch.array(flame_genral_model.shapedirs),
                        want_Jtr=True)
    neutral_mesh = Mesh(v=generated_neutral_mesh.r, f=generated_neutral_mesh.f)
    return neutral_mesh

Predicted mesh: [image]

Predicted mesh without the whole-mesh rotation, generated from my code: [image]
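
A possible way to call the function above, mirroring the loading code from the first post in this thread (out_param and out_mesh are the placeholder paths used there):

import numpy as np

params = np.load(out_param, allow_pickle=True)
params = params[()]
neutral_mesh = make_prdicted_mesh_without_srt(params, config.flame_model_path)
neutral_mesh.write_obj(out_mesh)
print("Saved mesh without the global rotation to " + out_mesh)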