I have been working on my own project to stream data to the iClone app, as I mentioned in one of my earlier posts. What I have noticed is that FaceMesh does not seem to pick up cheek movements, and in general it is not good at capturing enough facial movement for 3D motion capture. If I puff my cheeks out, I would expect FaceMesh to pick up that change and deform the mesh accordingly, but I see no sign of the movement being detected. With 468 keypoints, I expected it to be better at detecting these movements.
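One way to check whether the landmarks respond to cheek puffing at all is to log the distance between two cheek landmarks frame by frame and compare it to a neutral-face baseline. This is just a diagnostic sketch, not a fix: it assumes the landmarks are normalized (x, y) coordinates indexed by landmark number, and the cheek indices 50 and 280 are my assumption; verify them against the official FaceMesh landmark map before relying on them.

```python
import math

# Hypothetical cheek landmark indices -- check these against the
# MediaPipe FaceMesh landmark map before trusting the numbers.
LEFT_CHEEK = 50
RIGHT_CHEEK = 280

def cheek_width(landmarks):
    """Euclidean distance between the two cheek landmarks.

    `landmarks` maps landmark index -> (x, y) in normalized coordinates.
    """
    lx, ly = landmarks[LEFT_CHEEK]
    rx, ry = landmarks[RIGHT_CHEEK]
    return math.hypot(rx - lx, ry - ly)

def puff_ratio(current, neutral):
    """Ratio of current cheek width to a neutral-face baseline.

    A ratio consistently above 1.0 while puffing would mean the mesh
    is moving; a flat ratio supports the observation that it is not.
    """
    return cheek_width(current) / cheek_width(neutral)
```

If this ratio stays near 1.0 while you puff your cheeks, that would confirm the landmarks themselves are not tracking the deformation, rather than it being a downstream issue in the iClone streaming code.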