AUs are a way to describe facial expressions. If you want to look at classifying emotions or mental states from facial expressions, have a look at some recent survey papers for pointers:
"Automatic analysis of facial affect: A survey of registration, representation and recognition" by Evangelos Sariyanidi, Hatice Gunes, and Andrea Cavallaro
"Facial Expression Analysis" by Fernando De la Torre and Jeffrey Cohn
Thanks, Tadas
Hi, you found that it is more reliable to recognize some facial expressions from HOG features rather than from AUs. When will you add that functionality, or could you give me some advice? Thanks, Wang
How did you get the AU output into a folder? Which parameter did you use? I ran the command below but could not get any AUs:
C:\OpenFace-master\x64\Release>FeatureExtraction.exe -f "../../videos/Aarti.avi" -outroot "output" -simalign "output" -verbose
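(Note: the exact flags depend on your OpenFace version, so treat the following as an assumption rather than the definitive invocation. In newer releases something like
FeatureExtraction.exe -f "../../videos/Aarti.avi" -out_dir "output" -aus
should write a CSV into the output directory containing the AU intensity and presence columns (AU01_r, AU01_c, ...); older versions include the AU columns in the default output CSV rather than behind a dedicated flag.)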
Hello there!
Please, how can I get this visualization?
https://github.com/TadasBaltrusaitis/OpenFace/blob/master/imgs/au_sample.png
What did you use to get the AUs shown like this?
Hey, @kamushadenes.
I think you should check out the WINDOW-GUI branch (I'm not sure). It's the feature extraction tool, IIRC.
@TadasBaltrusaitis first of all, thank you so much for sharing such amazing work with the world. I am also working on a research project to recognize emotions using action units, so:
Do you know of any dataset that contains numerical action unit values with corresponding emotion labels? Or do I have to take image or video data, pass it through OpenFace to get the action units, and then label those action units manually to create a training set for emotion classification?
There are several datasets that come to mind:
http://www.pitt.edu/~emotion/ck-spread.htm
http://bosphorus.ee.boun.edu.tr/Content.aspx
However, depending on your purposes, a more reliable way would be to extract action units using OpenFace on a dataset of your interest and create your own training dataset, as that is more likely to generalize to your needs and would include the emotions you are interested in.
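A minimal sketch of that route in Python, assuming one FeatureExtraction CSV per video in a processed/ folder and a hypothetical labels.csv that maps each video name to an emotion label (the AU01_r-style column names follow the OpenFace output format; the file layout and everything else here are assumptions, not part of OpenFace):

# Rough sketch (not OpenFace's own API): summarise per-video AU intensities
# from FeatureExtraction CSVs and fit a simple emotion classifier.
import glob, os
import pandas as pd
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# labels.csv is a hypothetical file you create yourself: columns "video", "emotion"
label_map = pd.read_csv("labels.csv", index_col="video")["emotion"]

features, labels = [], []
for path in glob.glob("processed/*.csv"):            # one CSV per video from FeatureExtraction
    df = pd.read_csv(path)
    df.columns = [c.strip() for c in df.columns]     # some OpenFace versions pad column names
    au_cols = [c for c in df.columns if c.startswith("AU") and c.endswith("_r")]
    video = os.path.splitext(os.path.basename(path))[0]
    if video not in label_map.index:
        continue
    features.append(df[au_cols].mean().values)       # crude per-video summary of AU intensities
    labels.append(label_map[video])

clf = SVC(kernel="linear")
print(cross_val_score(clf, features, labels, cv=5))  # quick cross-validated sanity check

Averaging the AU intensities over a whole clip is only the simplest possible summary; per-frame labels or temporal features may work better depending on the data.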
Hi,
Thank you so much for the links and the email. Yes, I am using the CK+ dataset now for the emotions that I want to recognize.
Best regards, Kamran Ali
I got the AUs. How can I use the AUs to classify facial expressions? Thank you!