-
Hello,
First, thank you for your YouTube video "Realtime Emotion Analysis Using Keras"; it helped me a lot!
Could I ask you some questions about emotion recognition?
I have a problem but I don't know …
-
Hello dear Mediapipe team,
Our team found this Google Research paper, [“Blendshapes GHUM: Real-time Monocular Facial Blendshape Prediction”](https://arxiv.org/pdf/2309.05782.pdf), from 11 September …
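In case a concrete starting point helps: below is a minimal sketch of how the blendshape scores that paper describes can be read through the MediaPipe Tasks Python API. The model file name (`face_landmarker.task`) and the image path are assumptions, not part of the original message.

```python
# Minimal sketch: read per-face blendshape scores with the MediaPipe Tasks API.
# 'face_landmarker.task' and 'face.jpg' are placeholder paths.
import mediapipe as mp
from mediapipe.tasks import python as mp_python
from mediapipe.tasks.python import vision

options = vision.FaceLandmarkerOptions(
    base_options=mp_python.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,  # enables blendshape prediction
    num_faces=1,
)
landmarker = vision.FaceLandmarker.create_from_options(options)

image = mp.Image.create_from_file("face.jpg")
result = landmarker.detect(image)

# result.face_blendshapes holds one list of scored categories per detected face.
if result.face_blendshapes:
    for blendshape in result.face_blendshapes[0]:
        print(blendshape.category_name, round(blendshape.score, 3))
```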
-
Could you please provide a Dockerfile to create the needed environment?
I downloaded the project into the pytorch/pytorch Docker image, but had to fix some problems inside it.
It would also be useful to get ins…
-
I have tested the Emotion Recognition API with 10 different samples for each of the 7 emotions. None of the 70 tests failed; it was 100% accurate. But this was because I used a particular type of…
-
I trained for 400 epochs with only 3 emotions, but the network still does not recognize anything. It shows a single emotion; whatever I try, the probability bars only move a little and the main emotion does not chang…
-
### Is your feature request related to a problem? Please describe
Mascots are cute, fun, and add brand recognition.
Who doesn't love the Linux Penguin or the Gradle Elephant?
OpenSearch des…
-
Collecting data to aid advertising is a very real use case that is not represented here.
-
Hi,
I have multiple faces in a video, and I want to make sure that the emotions I am detecting correspond to specific faces so the results are not mixed up.
Can py-feat do this?
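For what it's worth, here is a minimal sketch under the assumption that py-feat's `Detector.detect_video()` returns a frame-indexed dataframe with one row per detected face; the column names below are taken from recent py-feat versions and may differ in yours. This only keeps each emotion attached to its face within a frame; linking the same person across frames would still need some form of identity tracking.

```python
# Minimal sketch: group py-feat detections per frame so each face keeps its
# own emotion scores. Column names are assumptions from recent versions.
from feat import Detector

detector = Detector()
fex = detector.detect_video("my_video.mp4")  # placeholder file name

# One row per face per processed frame; faces in the same frame share 'frame'.
for frame_id, faces in fex.groupby("frame"):
    for _, face in faces.iterrows():
        box = (face["FaceRectX"], face["FaceRectY"],
               face["FaceRectWidth"], face["FaceRectHeight"])
        top_emotion = face[["anger", "disgust", "fear", "happiness",
                            "sadness", "surprise", "neutral"]].astype(float).idxmax()
        print(frame_id, box, top_emotion)
```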
-
## Information
The problem arises in chapter:
* [ ] Introduction
* [X] Text Classification
* [ ] Transformer Anatomy
* [ ] Multilingual Named Entity Recognition
* [ ] Text Generation
* [ ] …
-
Hi, I just ran the VGG Face 2 trainer/classifier on images labeled with 4 emotions: Angry, Happy, Sad, and Neutral. Using the split dataset I am getting an accuracy of only 25%. Any thoughts on how I ca…
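Not from the original post, but since 25% is exactly chance level for 4 balanced classes, a quick first check is whether the label order matches between training and evaluation, and whether the model has collapsed onto a single class. A minimal sketch with placeholder arrays (replace them with your classifier's predictions on the split test set):

```python
# Diagnostic sketch: inspect the confusion matrix to see whether the model
# is guessing or predicting one class. Arrays below are placeholders.
import numpy as np
from sklearn.metrics import confusion_matrix, classification_report

rng = np.random.default_rng(0)
y_true = rng.integers(0, 4, size=200)   # ground-truth labels 0..3
y_prob = rng.random((200, 4))           # e.g. model.predict(test_data)
y_pred = np.argmax(y_prob, axis=1)

labels = ["Angry", "Happy", "Sad", "Neutral"]
# A single dominant column means the model collapsed onto one class.
print(confusion_matrix(y_true, y_pred))
print(classification_report(y_true, y_pred, target_names=labels))
```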