UPstartDeveloper opened 3 years ago
Engineering Standup: April 10, 2021
Engineering Standup: April 11, 2021
Working with the face-api.js NPM package to be able to detect emotions in real-time via the webcam.
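For context, a minimal sketch of what wiring face-api.js to the webcam could look like (the '/models' weights path and the choice of the tiny face detector are assumptions, not details from the standups):

```js
// Hypothetical setup sketch: load face-api.js weights, attach the
// webcam stream, and poll for expression predictions.
import * as faceapi from 'face-api.js';

async function startExpressionTracking(videoEl) {
  // Load the face detector and the expression classifier weights
  // ('/models' is an assumed path to the served weight files)
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
  await faceapi.nets.faceExpressionNet.loadFromUri('/models');

  // Pipe the webcam stream into the <video> element
  videoEl.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });

  setInterval(async () => {
    const result = await faceapi
      .detectSingleFace(videoEl, new faceapi.TinyFaceDetectorOptions())
      .withFaceExpressions();
    if (result) {
      // Probabilities for neutral, happy, sad, angry, fearful,
      // disgusted, and surprised
      console.log(result.expressions);
    }
  }, 100);
}
```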
Engineering Standup: April 12, 2021
Figured out that the face object is what can adjust the robot's expressions. Working on pulling it out of the loader.load call and returning it, so that it can be used in the script for the expression tracking.
Engineering Standup: April 14, 2021
Still working out how to return the face variable from within the loader.load function.
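Since the last two standups circle around the same blocker, here is a minimal sketch of one common way out: loader.load is asynchronous, so a value set inside its callback cannot simply be returned, but wrapping the call in a Promise lets the rest of the script await the face object (the 'face' node name and the model path are assumptions about the project's scene graph):

```js
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

// Sketch: wrap loader.load in a Promise so callers can await the mesh
// instead of trying to return it from inside the callback.
function loadFace(url) {
  const loader = new GLTFLoader();
  return new Promise((resolve, reject) => {
    loader.load(
      url,
      (gltf) => {
        // Pull the mesh that drives the robot's expressions out of the
        // loaded scene graph ('face' is a hypothetical node name)
        resolve(gltf.scene.getObjectByName('face'));
      },
      undefined, // no progress handler needed here
      reject
    );
  });
}

// Usage elsewhere in the script (hypothetical path):
//   const face = await loadFace('models/robot.glb');
```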
Engineering Standup: April 21, 2021
Engineering Standup: May 29, 2021
Running the face-api.js model right now, predicting user expressions and activating animations on the robot accordingly.
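A rough sketch of what that predict-then-animate step could look like each frame; robot.playAnimation is a hypothetical helper, not a function from the repo:

```js
// Sketch: pick the expression face-api.js scores highest and trigger
// the matching robot animation (playAnimation is hypothetical).
async function updateRobot(videoEl, robot) {
  const result = await faceapi
    .detectSingleFace(videoEl, new faceapi.TinyFaceDetectorOptions())
    .withFaceExpressions();
  if (!result) return;

  // Sort the label/probability pairs and take the top label
  const [topExpression] = Object.entries(result.expressions)
    .sort((a, b) => b[1] - a[1])[0];

  robot.playAnimation(topExpression);
}
```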
Engineering Standup: May 30, 2021
Previous: opened up PR #24 to keep a record of how the new Roberto model is working.
Today: Edited robot.js on line 61, changing the alterExpressions function to account for the new expressions, so it looked like this:
```js
// C: only change if new detections are different from existing values
const newEmotionValues = [
  detections[0].expressions.angry,
  detections[0].expressions.surprised,
  detections[0].expressions.sad,
  detections[0].expressions.happy,
  detections[0].expressions.fearful, // face-api.js labels this 'fearful', not 'afraid'
];
```
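For completeness, a sketch of the comparison the "only change if different" comment describes; the currentEmotionValues array and the 0.05 threshold are illustrative assumptions, not values from robot.js:

```js
// Sketch: keep the last values applied to the robot and only update
// when a new detection differs noticeably (threshold is illustrative).
let currentEmotionValues = [0, 0, 0, 0, 0];

function emotionsChanged(newValues) {
  return newValues.some(
    (value, i) => Math.abs(value - currentEmotionValues[i]) > 0.05
  );
}

if (emotionsChanged(newEmotionValues)) {
  currentEmotionValues = newEmotionValues;
  // ...apply the new values to the robot's expression animations
}
```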
Blockers: For some reason, this new model is still missing the robot arms and is silver in color (the image will be linked below). Here is the link to the .blend file that the GLTF file was created from, as I am still trying to learn more about exporting .blend files to .glb. Also, this error appeared in the console, indicating there may be something wrong with the new animation loop:
```
TypeError: Cannot read property 'morphTargetDictionary' of undefined
    at createGUI (robot.js:167)
```
Next Steps: I think we'll need to better understand how to export .blend files to GLTF. One thing that may help is looking at the RobotV2_report.json file I uploaded (in the same expressionTrackingDemos/models/ folder), which could help us validate whether the process I used (clicking the export button in Blender) is working ok or not. Then, for the animation loop, I could try and see what properties are actually accessible in the morphTargetDictionary object.
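A small sketch of that inspection step: traverse the loaded scene and log every mesh that actually carries morph targets, which should reveal whether the object createGUI expects survived the export:

```js
// Sketch: list every mesh in the exported scene that has morph targets,
// along with the shape-key names Blender exported for it.
function logMorphTargets(scene) {
  scene.traverse((node) => {
    if (node.isMesh && node.morphTargetDictionary) {
      console.log(node.name, Object.keys(node.morphTargetDictionary));
    }
  });
}

// Usage inside the loader.load callback: logMorphTargets(gltf.scene);
```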
How do we make sure the colors of a .blend model stay with it when we export to GLTF?
"One other idea to consider that could be quite powerful. Is it possible to extract facial expressions to drive an avatar? Look at the face mesh in the Mediapipe Google example.
Extract a fixed list of facial expressions: happy, sad, curious, surprise, etc. ---> map to predefined face animations. Here is a threejs example (starting point) https://threejs.org/examples/#webgl_animation_skinning_morph
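A sketch of the mapping idea from the quoted comment, in the style of the linked three.js morph example: each detected expression label drives one morph target influence (the label-to-shape-key names below are assumptions about how the avatar would be rigged):

```js
// Sketch: map face-api.js expression labels to shape-key names and
// drive the avatar's morph target influences with the probabilities.
const expressionToMorph = {
  happy: 'Happy',
  sad: 'Sad',
  surprised: 'Surprised',
  angry: 'Angry',
};

function applyExpressions(face, expressions) {
  for (const [label, morphName] of Object.entries(expressionToMorph)) {
    const index = face.morphTargetDictionary[morphName];
    if (index !== undefined) {
      face.morphTargetInfluences[index] = expressions[label];
    }
  }
}
```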