-
Hi,
I am working on a project where I get a time series of the user's facial landmarks in real time (from a web app that uses the webcam). Can you please suggest how I can use the facial landmark tim…
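The question above is truncated, but a common first step with real-time landmark streams is temporal smoothing over a sliding window before extracting features. A minimal stdlib-only sketch; the class and window size are illustrative, not from any particular toolkit:

```python
from collections import deque

class LandmarkSmoother:
    """Moving-average filter over a sliding window of landmark frames.

    Each frame is a list of (x, y) tuples; window_size is the number of
    recent frames averaged together (an illustrative default, tune as needed).
    """
    def __init__(self, window_size=5):
        self.window = deque(maxlen=window_size)

    def update(self, landmarks):
        """Add one frame of landmarks and return the smoothed frame."""
        self.window.append(landmarks)
        n = len(self.window)
        smoothed = []
        for i in range(len(landmarks)):
            sx = sum(frame[i][0] for frame in self.window) / n
            sy = sum(frame[i][1] for frame in self.window) / n
            smoothed.append((sx, sy))
        return smoothed

# Example: three frames of a single landmark drifting right
smoother = LandmarkSmoother(window_size=3)
smoother.update([(0.0, 0.0)])
smoother.update([(1.0, 0.0)])
result = smoother.update([(2.0, 0.0)])  # x averages to (0+1+2)/3 = 1.0
```

The same window structure also gives you velocity features for free (difference between the newest and oldest frame divided by the window span).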
-
Hi!
Thank you for developing this fantastic toolkit! I am using it to detect facial action units and emotion changes in videos featuring only a single person's face, given that there is only one pe…
-
Hi, could you tell me what AU means? Thanks!
-
Dear author
I can obtain action unit features when I use FeatureExtraction.exe. I have some questions for you:
1. Can I use these features to do facial expression recognition?
2. Are these…
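On question 1: AU intensity vectors are commonly used as input features for an expression classifier. A toy nearest-centroid sketch under that assumption; the AU values and labels below are invented for illustration:

```python
import math

def nearest_centroid(train, query):
    """Classify a query AU-intensity vector by the closest class centroid.

    train: dict mapping label -> list of feature vectors (lists of floats).
    Returns the label whose mean vector is nearest in Euclidean distance.
    """
    centroids = {}
    for label, vectors in train.items():
        dim = len(vectors[0])
        centroids[label] = [sum(v[i] for v in vectors) / len(vectors)
                            for i in range(dim)]

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    return min(centroids, key=lambda lbl: dist(centroids[lbl], query))

# Toy data: [AU06, AU12] intensities (cheek raiser, lip corner puller)
train = {
    "smile":   [[2.5, 3.0], [3.0, 2.8]],
    "neutral": [[0.2, 0.1], [0.0, 0.3]],
}
label = nearest_centroid(train, [2.7, 2.9])  # → "smile"
```

In practice you would replace this toy classifier with an SVM or small neural network trained on labeled AU vectors, but the feature-vector-in, label-out shape is the same.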
-
Hi @ESanchezLozano ,
Is it possible to upload the code for generating the ground-truth AU heatmaps?
I am trying to implement your code for generating ground-truth AU heatmaps, but did not find it in t…
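The repository question above goes unanswered here, but ground-truth AU heatmaps are typically 2D Gaussians centered on the AU's associated landmark. A minimal sketch under that assumption; the map size and sigma are illustrative choices, not the paper's values:

```python
import numpy as np

def gaussian_heatmap(height, width, center, sigma=2.0):
    """Return a (height, width) map with a 2D Gaussian peaking at `center`.

    center is (x, y) in pixel coordinates; the peak value is 1.0.
    """
    xs = np.arange(width)            # shape (width,)
    ys = np.arange(height)[:, None]  # shape (height, 1), broadcasts to a grid
    cx, cy = center
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))

hm = gaussian_heatmap(64, 64, center=(20, 30))
# Peak value 1.0 at row 30, column 20
```

One heatmap per AU (or per AU-landmark pair), stacked along a channel axis, is the usual training target format for heatmap-regression networks.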
-
I'm looking to create a system that analyses the Facial Action Units from a webcam feed and applies them to a 3D character model in a Unity application in real time. To use OpenFace for this I either …
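A common way to drive a Unity character from OpenFace output is to map AU intensities (OpenFace's `AUxx_r` columns, on a 0–5 scale) onto blendshape weights (0–100 in Unity). A minimal sketch of that mapping; the AU-to-blendshape table here is invented for illustration and a real rig would define its own:

```python
# Hypothetical AU -> blendshape name table; adjust to your character rig.
AU_TO_BLENDSHAPE = {
    "AU12_r": "mouthSmile",  # lip corner puller
    "AU04_r": "browDown",    # brow lowerer
    "AU26_r": "jawOpen",     # jaw drop
}

def au_to_blendshape_weights(au_intensities):
    """Convert OpenFace AU intensities (0-5) to blendshape weights (0-100).

    au_intensities: dict like {"AU12_r": 2.5, ...}.
    Unknown AUs are ignored; values are clamped to the valid range.
    """
    weights = {}
    for au, value in au_intensities.items():
        shape = AU_TO_BLENDSHAPE.get(au)
        if shape is None:
            continue
        clamped = max(0.0, min(5.0, value))
        weights[shape] = clamped / 5.0 * 100.0
    return weights

weights = au_to_blendshape_weights({"AU12_r": 2.5, "AU99_r": 1.0})
# → {"mouthSmile": 50.0}
```

The resulting weights can then be streamed to Unity (e.g. over a local socket) and applied per frame on the Unity side.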
-
# [Facial Recognition] Facial Emotion Recognition: Single-Rule 1–0 DeepLearning
- Author: [Carlos Argueta](https://medium.com/@kidargueta)
- Origin: [https://medium.com/@kidargueta/facial-emotion-reco…
-
hi,
Can you please confirm that `gh2` contains the attention maps?
https://github.com/rakutentech/FAU_CVPR2021/blob/0bfb778526908f36b6136e836d8b382877bacfa4/inference.py#L53
It is of size `(batch_size,…
-
# Overview
As @fire shared in issue [27](https://github.com/NumesSanguis/FACSvatar/issues/27#issuecomment-908193941), Google has released [MediaPipe Face Mesh as a Python library](https://google.gith…
-
I have been trying to figure out how to use this for **inference** and evaluate other datasets without finetuning.
The scripts explain how you can use the model with the extracted features but I h…