sortofsleepy / ofxARKit

A starting point for openFrameworks and ARKit experimentation.

Implement blendShapes and lookAtPoint (gaze) for face detector #54

Open aferriss opened 6 years ago

aferriss commented 6 years ago

Blend shapes were added in iOS 11.3, but I don't think they ever made it into the repo. There are quite a few of them listed out here. However, some new ones were added in iOS 12.0 for the tongue. I don't have an iPhone X, so unfortunately I can't test these out. I would think you can just pull them out of the faceAnchor object's .raw property, but I haven't tried.

There's also a new lookAtPoint property of the faceAnchor that provides gaze direction. You can get individual eye transforms as well.
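For reference, here's roughly what reading those raw properties might look like (an untested sketch based on Apple's ARFaceAnchor docs; `face.raw` follows the addon's existing pattern):

// Untested sketch: the raw ARKit 2 gaze properties on ARFaceAnchor,
// read through the addon's .raw handle.
simd_float3   gaze     = face.raw.lookAtPoint;        // gaze point in face space
simd_float4x4 leftEye  = face.raw.leftEyeTransform;   // per-eye transforms,
simd_float4x4 rightEye = face.raw.rightEyeTransform;  // relative to the anchor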

Since these are both just properties of faceAnchors, I'm not sure how best to handle implementing them. It might be nice to provide some constants so you could request a specific shape. Maybe something like:

if (anchor is faceAnchor) {
    float tongueOut = getFeature(TONGUE); // returns a float between 0.0 and 1.0
    float leftEyeBlink = getFeature(LEFT_EYE_BLINK); // 0.0 open, 1.0 closed
}
cwervo commented 6 years ago

Hey @aferriss, thanks for filing this!

Blend Shapes

A quick note: you could actually always access blendShapes using Apple's Objective-C API (I've been doing this for the last couple of months). It looks basically like this:

for (auto & face : processor->getFaces()){
    mouthSmileLeft = face->raw.blendShapes[ARBlendShapeLocationMouthSmileLeft].floatValue;
}

That's super long, and it's annoying that you have to access the float explicitly, so I'm down with putting a convenient wrapper in ofxARKit! I've been toying with this idea today; when I wrap that up into a method getBlendShape(ARBlendShapeLocation blendShapeLocation) on FaceAnchorObject, we get this:

for (auto & face : processor->getFaces()){
    mouthSmileLeft = face.getBlendShape(ARBlendShapeLocationMouthSmileLeft);
}
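For reference, a minimal sketch of what that wrapper could look like on FaceAnchorObject (assuming `raw` is the underlying ARFaceAnchor, as above; the exact placement is illustrative):

// Sketch of the wrapper: blendShapes is an
// NSDictionary<ARBlendShapeLocation, NSNumber *>, so we unwrap the
// NSNumber into a plain float in the 0.0 - 1.0 range.
float getBlendShape(ARBlendShapeLocation blendShapeLocation){
    return raw.blendShapes[blendShapeLocation].floatValue;
}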

I think it's much nicer! Anything you'd like to see changed here? Also, I think accepting an ARBlendShapeLocation is convenient because people can just use the names defined by Apple that you pointed out above.

This is also super convenient because it expands as Apple adds more blend shapes, so the tongue works with this method (presuming iOS 12, obviously). I can get this blue-dot color effect driven by the tongue blend shape (value printed at the bottom of the screen):

(GIF: tongue_dots_example)
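Driving an effect like that is basically a one-liner through the wrapper; ARBlendShapeLocationTongueOut is Apple's iOS 12 constant, and the color mapping below is just illustrative:

float tongueOut = face.getBlendShape(ARBlendShapeLocationTongueOut); // 0.0 - 1.0
ofSetColor(0, 0, 255 * tongueOut); // scale the blue channel by the tongue value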

Eye Tracking

It's convenient to add this method to FaceAnchorObject:

ofVec3f getLookAtPoint() {
    simd_float3 pt = raw.lookAtPoint;
    return ofVec3f(pt.x, pt.y, pt.z);
}

// Usage: face.getLookAtPoint() => ofVec3f(x, y, z)

But I can't quite figure out the math needed to make the eye transforms usable in OF; namely, converting between the simd_float4x4 Apple uses and glm's mat4 type seems nontrivial. I'll take a look at this another day!

That said, they are easily accessible using:

face.raw.leftEyeTransform // get back the simd_float4x4 matrix from ARKit2
aferriss commented 6 years ago

Hey @AndresCuervo, I think your proposed getBlendShape sounds perfect! Thanks for looking into this.

As for the eye tracking: I believe there is already a convert() method to go from simd to ofMatrix4x4, and you should be able to use the same convert() function to get it into glm. I believe those functions were actually pulled from Cinder, so you can easily drop this into ARUtils:

static inline const glm::mat4 toGlmMat4( const matrix_float4x4 & mat ){
    return convert<matrix_float4x4, glm::mat4>(mat);
}
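A quick usage sketch, assuming toGlmMat4 above has been dropped into ARUtils and is in scope where you process faces:

// Convert the ARKit 2 eye transforms into glm::mat4 for use in OF.
for (auto & face : processor->getFaces()){
    glm::mat4 leftEye  = toGlmMat4(face.raw.leftEyeTransform);
    glm::mat4 rightEye = toGlmMat4(face.raw.rightEyeTransform);
}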
cwervo commented 5 years ago

Hey @aferriss, I made a project (and just updated it to work with the current master branch of ofxARKit) that uses the eye-tracking lookAt, the tongue blendShape (unique to iOS 12), and the toGlmMat4 functions. I think this addresses everything in this issue; take a look and close it, or comment if you think there's anything missing!