Closed: Richard-wang85 closed this issue 2 years ago
Thanks for your attention. First, you need to crop your caricature manually so that the face proportion is similar to the examples shown. To achieve this, you can use a face alignment algorithm (such as the dlib face detection method) to obtain the bounding box of the face region, then resize the crop to 224×224. Second, use our pretrained model to test it; you can follow the code shown in the 'Test with Pretrained Model' part.
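The cropping step above can be sketched roughly as follows. This is a minimal NumPy-only sketch: the bounding box would normally come from a detector such as dlib's `get_frontal_face_detector`, and the `margin` value here is an assumption, not the repo's exact face proportion.

```python
import numpy as np

def crop_face(img, box, out_size=224, margin=0.25):
    """Crop a face bounding box (left, top, right, bottom), expanded by a
    margin, and resize to out_size x out_size with nearest-neighbour sampling.
    The box would normally come from a face detector such as dlib."""
    left, top, right, bottom = box
    w, h = right - left, bottom - top
    # Expand the box so the crop keeps a face proportion similar to the examples.
    left = max(0, int(left - margin * w))
    right = min(img.shape[1], int(right + margin * w))
    top = max(0, int(top - margin * h))
    bottom = min(img.shape[0], int(bottom + margin * h))
    crop = img[top:bottom, left:right]
    # Nearest-neighbour resize via integer index maps (no external dependencies).
    ys = np.arange(out_size) * crop.shape[0] // out_size
    xs = np.arange(out_size) * crop.shape[1] // out_size
    return crop[ys][:, xs]
```

In practice you would replace the nearest-neighbour resize with `cv2.resize` or `PIL.Image.resize` for better quality; the point is only the crop-then-resize-to-224×224 flow.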
Thank you for your reply! Following your instructions, I have successfully tested my data, but could you tell me how to get the predicted mesh with texture as shown in Fig. 8 of your paper? I am looking forward to your reply!
To obtain a textured mesh, you need to project each vertex of the mesh onto the image plane by weak perspective projection, then color each vertex with the pixel it projects to. You can refer to the function CalculateLandmark2D defined in "cariface.py".
You can ignore the landmark index selection step in the function CalculateLandmark2D; just project every vertex to the 2D plane.
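The projection described above might be sketched like this. Note this is a hedged reconstruction: the Euler-angle order and the exact parameter layout in `cariface.py` may differ, so the function CalculateLandmark2D remains the authoritative reference.

```python
import numpy as np

def euler_to_rotation(angles):
    """Rotation matrix from Euler angles (x, y, z) in radians.
    The Rz @ Ry @ Rx order here is an assumption; check cariface.py."""
    x, y, z = angles
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(x), -np.sin(x)],
                   [0, np.sin(x),  np.cos(x)]])
    Ry = np.array([[ np.cos(y), 0, np.sin(y)],
                   [0, 1, 0],
                   [-np.sin(y), 0, np.cos(y)]])
    Rz = np.array([[np.cos(z), -np.sin(z), 0],
                   [np.sin(z),  np.cos(z), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

def project_vertices(verts, scale, angles, trans):
    """Weak perspective projection: p2d = scale * (R @ v)[:2] + trans.
    verts is (N, 3); returns (N, 2) image-plane coordinates."""
    rotated = verts @ euler_to_rotation(angles).T
    return scale * rotated[:, :2] + np.asarray(trans)
```

Unlike CalculateLandmark2D, this projects all N vertices rather than selecting landmark indices, which is exactly the change the comment above suggests.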
I am sorry, but I cannot follow you very well. Could you give me more advice or a more detailed description?
I would appreciate it if you could give me instructions like a "README". Thank you for your enthusiastic answer!
To obtain Fig. 8 in our paper, you need to color each vertex with its projected color. Our method uses weak perspective projection and also estimates the weak perspective parameters (scale + euler_angle + trans), as shown in 'cariface.py'. You will have to write this code yourself: refer to the function CalculateLandmark2D in 'cariface.py' to obtain the 2D projection of each 3D vertex, then use the original caricature image to color each vertex (directly set the vertex color to the color of its projected pixel).
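Vertex colouring from the projected points might look like the following sketch. It uses nearest-pixel sampling and assumes the projected coordinates are already in the image's pixel coordinate system; depending on the convention in cariface.py, a vertical flip or offset may be needed.

```python
import numpy as np

def color_vertices(points_2d, image):
    """Return one colour per vertex by sampling the nearest pixel of the
    caricature image at each projected 2D point (x, y)."""
    h, w = image.shape[:2]
    # Round to the nearest pixel and clamp to the image bounds.
    xs = np.clip(np.round(points_2d[:, 0]).astype(int), 0, w - 1)
    ys = np.clip(np.round(points_2d[:, 1]).astype(int), 0, h - 1)
    return image[ys, xs]
```

Bilinear sampling would give smoother colours, but nearest-pixel lookup is what "directly set the vertex color as its projection color" describes.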
Our method estimates the 3D vertices and the weak perspective parameters. If you want to color the mesh, you need to write code that projects the vertices onto the 2D caricature image (please refer to the function CalculateLandmark2D), then sets the color of each vertex to the color of the corresponding projected pixel.
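Once every vertex has a colour, one common way to save the result is the widely supported (though non-standard) `v x y z r g b` OBJ extension, which viewers such as MeshLab render directly. The helper below is a hypothetical sketch, not part of the repo.

```python
def save_colored_obj(path, verts, colors, faces):
    """Write a mesh with per-vertex RGB (values in [0, 1]) using the
    'v x y z r g b' OBJ extension; faces are 0-based vertex index triples."""
    with open(path, "w") as f:
        for v, c in zip(verts, colors):
            f.write(f"v {v[0]} {v[1]} {v[2]} {c[0]} {c[1]} {c[2]}\n")
        for a, b, c in faces:
            f.write(f"f {a + 1} {b + 1} {c + 1}\n")  # OBJ indices are 1-based
```

If the sampled pixel colours are 0-255 integers, divide them by 255.0 before writing, since the vertex-colour extension expects values in [0, 1].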
I have got it! Thank you very much! Looking forward to further communication!
Hello! Thank you for your splendid work! However, I do not know how to test my own data with your model. Could you give me some advice? Thanks for your reply!