TadasBaltrusaitis / OpenFace

OpenFace – a state-of-the-art tool for facial landmark detection, head pose estimation, facial action unit recognition, and eye-gaze estimation.

Integrating OpenFace in other projects #409

Closed: elahia closed this issue 6 years ago

elahia commented 6 years ago

Hi,

Emotion detection is one of the tasks in my project, and I decided to do it with OpenFace (thanks for saving me a lot of time). As a second step I need the outputs of OpenFace, so how can I call (or integrate) OpenFace in my own project? I'm going to use Docker for implementing the different modules.

Other info: I'm using libfreenect2 on Ubuntu 14.04 with C++.

Thanks in advance

NumesSanguis commented 6 years ago

Talk about good timing. A professor in my lab just got OpenFace working with ZeroMQ (a brokerless messaging library) in real time: https://github.com/TadasBaltrusaitis/OpenFace/issues/375. That only runs on Windows for now, but the good thing about ZeroMQ is that it can send data over the local network. You can run OpenFace on a Windows machine and listen for the FACS data on your Ubuntu machine (or maybe make a Docker image with a Windows server running OpenFace).
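
For the listening end on Ubuntu, a few lines of C++ suffice. A minimal sketch, assuming the cppzmq header (zmq.hpp) and that the Windows side publishes one message per frame on a PUB socket; the address, port, and wire format here are hypothetical placeholders:

```cpp
#include <zmq.hpp>
#include <iostream>
#include <string>

int main() {
    zmq::context_t ctx(1);
    zmq::socket_t sub(ctx, ZMQ_SUB);
    sub.connect("tcp://192.168.1.10:5570");   // hypothetical address of the Windows publisher
    sub.setsockopt(ZMQ_SUBSCRIBE, "", 0);     // no topic filter: receive everything

    while (true) {
        zmq::message_t msg;
        sub.recv(&msg);                        // blocks until the next frame's data arrives
        std::string data(static_cast<char*>(msg.data()), msg.size());
        std::cout << "FACS frame: " << data << std::endl;
    }
    return 0;
}
```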

You might be interested in FACSvatar, a framework that uses OpenFace's FACS data to animate avatars: https://github.com/NumesSanguis/FACSvatar. Everything in that network is set up in modules, using ZeroMQ, so it seems your module approach would fit right in. The documentation should be in proper shape by the end of this month :')

If you don't need real-time analysis, you can do it without Windows by using FACSvatar's input_facs-from-csv module. I also hope to release these modules as Docker images at some point.

My next step is trying to generate FACS using Deep Neural Networks, so that could also benefit from an emotion detection module ^_^

TadasBaltrusaitis commented 6 years ago

Hi,

There are a number of ways you can integrate OpenFace into your own project, depending on whether you want "online" or "offline" integration.

For offline integration you could just use OpenFace to process the data and output .csv files that are then consumed by other modules.
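
A downstream module then only needs standard file I/O. A rough sketch, assuming a file produced by FeatureExtraction and the usual OpenFace column naming (check the header of your own .csv, since columns vary with version and flags):

```cpp
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main() {
    std::ifstream csv("processed/video.csv");   // output path depends on how OpenFace was run
    std::string line;

    // Parse the header row and locate the column of interest (AU01 intensity here).
    std::getline(csv, line);
    std::vector<std::string> headers;
    std::stringstream header_stream(line);
    for (std::string col; std::getline(header_stream, col, ','); ) {
        size_t start = col.find_first_not_of(' ');   // trim the leading space after each comma
        headers.push_back(start == std::string::npos ? col : col.substr(start));
    }

    size_t au01_col = 0;
    while (au01_col < headers.size() && headers[au01_col] != "AU01_r")
        ++au01_col;

    // Stream the chosen column from every data row.
    while (std::getline(csv, line)) {
        std::stringstream row(line);
        std::string cell;
        for (size_t i = 0; std::getline(row, cell, ','); ++i)
            if (i == au01_col)
                std::cout << "AU01 intensity:" << cell << std::endl;
    }
    return 0;
}
```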

For online integration you could follow the suggestion from @NumesSanguis and use a messaging library like ZeroMQ to communicate between OpenFace and your project. You could also integrate it using various inter-process communication tools, such as Named Pipes on Windows. Another option is to include OpenFace as a C++ library in your project (this would require a reasonable amount of engineering, though).

There are many other alternatives as well.

Thanks, Tadas

elahia commented 6 years ago

Dear both,

Thank you both so much for introducing ZeroMQ; I'm sure it will come in handy in my project. Sorry for the late answer; my research has actually taken a turn, and now I'm going to test my model on just one emotion: sadness!

In this new scenario I know the user's affective state, but what is crucial is the exact amount of his sadness!

So, do you know whether OpenFace (or any other software, library, app, etc.) can measure the exact value of emotion intensity?

From a technical point of view, I need to train my model for each user offline so that it adapts to his personality, and then test it in real-time interaction.

Thanks in advance, Elahe


TadasBaltrusaitis commented 6 years ago

OpenFace does not support emotion recognition; instead it recognizes facial expressions (Action Units). You could, however, use the features extracted by OpenFace as input for building an emotion recognition system.
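
To make that concrete, here is a toy sketch of such a layer on top of OpenFace's AU intensities. AU1 (inner brow raiser), AU4 (brow lowerer), and AU15 (lip corner depressor) are commonly associated with sadness in the FACS literature, but the weights below are made up for illustration; a real system would learn a model from labelled data:

```cpp
#include <map>
#include <string>

// Naive "sadness" score from AU intensities (OpenFace's *_r outputs are on a 0-5 scale).
// The weights are illustrative placeholders, not a validated model.
double sadness_score(const std::map<std::string, double>& au) {
    auto get = [&](const std::string& name) {
        auto it = au.find(name);
        return it != au.end() ? it->second : 0.0;
    };
    return (0.4 * get("AU01_r") + 0.3 * get("AU04_r") + 0.3 * get("AU15_r")) / 5.0;  // roughly 0..1
}
```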

Thanks, Tadas

NumesSanguis commented 6 years ago

@elahia I would argue you cannot get an "exact" value of sadness. There are different theories of emotion, but I don't think anyone has successfully quantified an emotion. Some argue this is due to limitations of technology, but do you ever hear of a percentage of sadness? In daily life you only hear "a bit sad", "sad", and "very sad".

Another explanation could be that emotions are a social reality, not a physical reality, which means an emotion exists only in a person's mind. That means you have as many 'sad's as there are people viewing the scene. We can still communicate the feeling, though, because most people's concept of sad is similar. This means you can perhaps only measure how much people's concepts match, not obtain an exact value.

If you're interested in that, look up the Theory of Constructed Emotion, by e.g. Lisa Feldman Barrett.

elahia commented 6 years ago

@NumesSanguis Thank you so much for your explanation. Yes, I see what you mean and I agree, but I thought that with wearable sensors, heart rate, and the like, maybe we could get an approximation of it! Do you know whether that is possible?

Cheers, Elahe


NumesSanguis commented 6 years ago

@elahia There are a lot of people in computer science who indeed believe in that approach. Following the Theory of Constructed Emotion, however, that still wouldn't be enough. It is still very useful information, but you would need to attach context to it.

An example given by Lisa Feldman Barrett in her book “How Emotions are Made: The Secret Life of the Brain” goes something like this: one day she goes on a date with a guy. Afterwards she feels weird in her stomach. Thinking back on her date, she relates this feeling to having "butterflies in her stomach" and concludes she must be interested in him. Later the feeling gets worse, and it turns out she had eaten bad food, hence the stomach ache. Her bodily input didn't change; her interpretation, however, did.

This shows that the data we get from our own body works more like pattern matching: we have experienced similar bodily feelings before, and we try to relate them to what is happening to us. The theory says that bodily input isn't an emotion until we have given an interpretation to it. This interpretation, however, is person-bound, so there is no ground truth for emotion.

The information can still be used, though, because every time we experience an emotion we probably have similar input from our body to our brain. She calls this interoception.

Another line of thinking: say we have a machine that can determine a person's emotion with 99% accuracy. What use is knowing that a person is "sad"? A bare label is of little use for an AI to build upon (beyond more if-then rules). However, if we think in concepts, what we mean when we say that someone looks sad is probably that the person has lost someone or something dear to them, which gives us an incentive to start a conversation and ask "What's wrong?".

An emotion word is a useful communicative tool when both people have a similar concept.

NumesSanguis commented 6 years ago

A TED talk by her from 2017-12 (18 min): https://www.ted.com/talks/lisa_feldman_barrett_you_aren_t_at_the_mercy_of_your_emotions_your_brain_creates_them

p.s. Sorry for using your issue page for an unrelated discussion to OpenFace.

elahia commented 6 years ago

@NumesSanguis Thanks for the inspiring video; I'm sure others will find it useful too.

I think we need to revisit this concept, and redefine and reconsider lots of other concepts, which requires a lot of time and financial support. By the way, I will try to take all these worthwhile concepts into account in my thesis.


elahia commented 6 years ago

Hi both,

Can OpenFace return facial features from real-time video, such as eye corner or lip corner positions?

Thanks, Elahe


TadasBaltrusaitis commented 6 years ago

Yes it can, but you will need to tap into the C++ code directly to access that.
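
Roughly, that looks like the sketch below, based on the LandmarkDetector library in the OpenFace source. The exact constructor and DetectLandmarksInVideo signatures differ between OpenFace versions, so treat this as a sketch to check against your checkout:

```cpp
#include <LandmarkCoreIncludes.h>   // OpenFace LandmarkDetector headers
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    LandmarkDetector::FaceModelParameters det_params;
    LandmarkDetector::CLNF face_model(det_params.model_location);

    cv::VideoCapture cap(0);        // webcam
    cv::Mat frame;
    while (cap.read(frame)) {
        cv::Mat_<uchar> gray;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

        // NOTE: the argument list is version-dependent; check your OpenFace checkout.
        if (LandmarkDetector::DetectLandmarksInVideo(gray, face_model, det_params)) {
            // detected_landmarks is a 2n x 1 matrix: n x-coordinates, then n y-coordinates
            // (element type is double or float depending on the OpenFace version).
            const auto& lm = face_model.detected_landmarks;
            int n = lm.rows / 2;
            // In the 68-point scheme, index 36 is a left-eye corner and 48 a lip corner.
            double eye_x = lm.at<double>(36), eye_y = lm.at<double>(36 + n);
            double lip_x = lm.at<double>(48), lip_y = lm.at<double>(48 + n);
            std::cout << "eye corner: (" << eye_x << ", " << eye_y << ")"
                      << "  lip corner: (" << lip_x << ", " << lip_y << ")\n";
        }
    }
    return 0;
}
```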

elahia commented 6 years ago

Dear @TadasBaltrusaitis,

I'm really confused! Would you please help me use OpenFace? I installed VS2015 and compiled OpenFace, ran OpenFaceOffline.exe, captured my own video via webcam, and got the corresponding feature files in the processed folder. Then I learned that a Matlab version of OpenFace also exists,

so the best option for me would be to capture my webcam video from Matlab and then analyse it ONLINE, in Matlab. Is that possible? If yes, how? Thanks, Elahe

TadasBaltrusaitis commented 6 years ago

The Matlab version is not integrated with a webcam; it is used more for prototyping. It would also be too slow for online webcam analysis.

For any real-time application, the C++ version is much more suitable.

elahia commented 6 years ago

@TadasBaltrusaitis Yes, I see. I'm now working in C++; so far so good. Thanks for your support.

elahia commented 6 years ago

@TadasBaltrusaitis Dear Tadas, I tried to get the AU values. Following the code, I found that the AU values are written into the CSV by the following part:

```cpp
// Inner loop of the CSV writer: for each known AU name, find its
// predicted value in au_occurences and append it to the output row.
for (std::string au_name : au_names_class)
{
    for (auto au_class : au_occurences)
    {
        if (au_name.compare(au_class.first) == 0)
        {
            output_file << ", " << au_class.second;
            break;
        }
    }
}
```

Then I tried to save the value of au_class.second in a separate array (AUTracker) for my own purposes, so I changed the code as follows:

```cpp
double AUTracker[18];   // one slot per AU, in the order AU01 ... AU45

output_file.precision(1);

for (std::string au_name : au_names_class)
{
    for (auto au_class : au_occurences)
    {
        if (au_name.compare(au_class.first) == 0)
        {
            output_file << ", " << au_class.second;

            // Copy the value into the slot reserved for this AU.
            if (au_name == "AU01")
                AUTracker[0] = au_class.second;
            else if (au_name == "AU02")
                AUTracker[1] = au_class.second;
            else if (au_name == "AU04")
                AUTracker[2] = au_class.second;
            else if (au_name == "AU05")
                AUTracker[3] = au_class.second;
            else if (au_name == "AU06")
                AUTracker[4] = au_class.second;
            else if (au_name == "AU07")
                AUTracker[5] = au_class.second;
            else if (au_name == "AU09")
                AUTracker[6] = au_class.second;
            else if (au_name == "AU10")
                AUTracker[7] = au_class.second;
            else if (au_name == "AU12")
                AUTracker[8] = au_class.second;
            else if (au_name == "AU14")
                AUTracker[9] = au_class.second;
            else if (au_name == "AU15")
                AUTracker[10] = au_class.second;
            else if (au_name == "AU17")
                AUTracker[11] = au_class.second;
            else if (au_name == "AU20")
                AUTracker[12] = au_class.second;
            else if (au_name == "AU23")
                AUTracker[13] = au_class.second;
            else if (au_name == "AU25")
                AUTracker[14] = au_class.second;
            else if (au_name == "AU26")
                AUTracker[15] = au_class.second;
            else if (au_name == "AU28")
                AUTracker[16] = au_class.second;
            else if (au_name == "AU45")
                AUTracker[17] = au_class.second;
            break;
        }
    }
}
```

But the problem is that the values in AUTracker are NOT the same as the values in the CSV file! So how can I get the true AU values?!

Regards,

elahia commented 6 years ago

Hi,

I still have no idea how to fix it. Any suggestions?

Thanks,

TadasBaltrusaitis commented 6 years ago

You can't just use == for string comparisons; have a look at http://www.cplusplus.com/reference/string/string/compare/
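
As an aside, the long chain of comparisons can be replaced by a name-to-index lookup table, whichever comparison you use; a sketch (the kAuIndex helper is hypothetical, not OpenFace code):

```cpp
#include <map>
#include <string>

// Maps each AU name to its slot in AUTracker; covers the same 18 AUs as the if-else chain.
static const std::map<std::string, int> kAuIndex = {
    {"AU01", 0},  {"AU02", 1},  {"AU04", 2},  {"AU05", 3},  {"AU06", 4},  {"AU07", 5},
    {"AU09", 6},  {"AU10", 7},  {"AU12", 8},  {"AU14", 9},  {"AU15", 10}, {"AU17", 11},
    {"AU20", 12}, {"AU23", 13}, {"AU25", 14}, {"AU26", 15}, {"AU28", 16}, {"AU45", 17}};

// Inside the matching branch:
//     auto it = kAuIndex.find(au_name);
//     if (it != kAuIndex.end()) AUTracker[it->second] = au_class.second;
```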

NumesSanguis commented 6 years ago

> But the problem is that the values in AUTracker are NOT the same as the values in the CSV file! So how can I get the true AU values?!

If I understand correctly, OpenFace does some post-processing after analysing the whole video, hence the accuracy is lower when getting the data in real time?

elahia commented 6 years ago

@TadasBaltrusaitis the problem is in this line:

AUTracker[...] = au_class.second;

If I print au_class.second it shows one value, but if I save it in AUTracker (immediately after the print statement) it shows another value...

@NumesSanguis Yes, but I'm trying to fetch the AU values exactly at the moment they are written into the CSV file, i.e. after all the analysis...

TadasBaltrusaitis commented 6 years ago

There is actually a secondary post-processing step which runs after all of the data has been written to the file. The .csv file gets overwritten after the video is processed.

elahia commented 6 years ago

@TadasBaltrusaitis So the only way to get the data is to read the CSV file! Very well, thank you for your response.

TadasBaltrusaitis commented 6 years ago

You can get it live as well, but the AU prediction will not be as accurate, since it misses the post-processing step. Predictions of all the other features should be identical, though.
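
Concretely, the live (pre-post-processing) values can be read off the FaceAnalyser object; a minimal sketch, with method names taken from the FaceAnalyser class in the OpenFace source (verify against your version):

```cpp
#include <FaceAnalyser.h>   // from the OpenFace FaceAnalyser library
#include <iostream>
#include <string>
#include <utility>
#include <vector>

// Call after feeding the current frame to the analyser.
void read_live_aus(FaceAnalysis::FaceAnalyser& analyser) {
    // Regression AUs: (name, intensity) pairs such as ("AU01", 1.37), on a 0-5 scale.
    std::vector<std::pair<std::string, double>> aus_reg = analyser.GetCurrentAUsReg();
    // Classification AUs: (name, presence) pairs such as ("AU01", 1.0) for present.
    std::vector<std::pair<std::string, double>> aus_class = analyser.GetCurrentAUsClass();
    // These are the per-frame values, so they can differ from the post-processed .csv.
    std::cout << aus_reg.size() << " intensity AUs, " << aus_class.size() << " presence AUs\n";
}
```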

elahia commented 6 years ago

Yes, but I think the AU features are the most important ones (at least in my project), so it is probably better to get their exact values. In my case about 10% of the predictions (maybe even fewer) were different, but the effect on my decision-making module is remarkably high.