This project is an implementation of facial emotion detection built with TensorFlow and trained on the FER2013 dataset. Our primary use case is detecting faces in online meetings so that presenters can gauge audience reactions and get a live, general sentiment reading of their audience.
This project was made possible with the contributions of the following people:

- https://github.com/cweien3008
- https://github.com/InitialCnotD
- https://github.com/Jamessukanto
- https://github.com/Thiggz
- https://github.com/ytan101
FER2013 (https://www.kaggle.com/datasets/msambare/fer2013) consists of 7 classes (angry, disgust, fear, happy, neutral, sad, surprise). To scope the project down to positive, neutral, and negative emotions, the surprise class was not considered.
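The collapse from FER2013's remaining six classes into three sentiment buckets can be sketched as a simple lookup. This is an illustrative assumption, not the project's actual code: the grouping below (happy as positive, neutral as neutral, the rest as negative) is one plausible mapping.

```python
# Hypothetical mapping of FER2013 emotion labels to coarse sentiment.
# "surprise" is deliberately absent, matching the project's reduced scope.
EMOTION_TO_SENTIMENT = {
    "angry": "negative",
    "disgust": "negative",
    "fear": "negative",
    "sad": "negative",
    "neutral": "neutral",
    "happy": "positive",
}

def to_sentiment(emotion: str) -> str:
    """Map a predicted emotion label to a coarse sentiment bucket."""
    return EMOTION_TO_SENTIMENT[emotion.lower()]
```

A presenter-facing dashboard would then only need to aggregate these three buckets across detected faces rather than all seven raw classes.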
```shell
git clone https://github.com/ytan101/workplace-emotion-detector.git
pip install -r requirements.txt
```
This implementation uses Streamlit to serve the webapp. Navigate to `src/webcam_feed` and run:

```shell
streamlit run app.py
```
To run the screen capture, from the folder containing `src` run:

```shell
python -m src.screencast.screen_capture
```

Press `esc` when done to quit the window.
Create a folder within `src` called `models` and load your pre-trained TensorFlow models, or download one from the below links. Once created, the path will be `src\models\MODEL_NAME`. (Change `MODEL_NAME` to something appropriate.)
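Loading a saved model from that folder might look like the sketch below. The helper names are hypothetical; `tf.keras.models.load_model` accepts either a SavedModel directory or an `.h5` file, and the import is kept inside the loader so the path helper works without TensorFlow installed.

```python
from pathlib import Path

def find_model_path(models_dir: str, model_name: str) -> Path:
    """Return the expected src/models/MODEL_NAME path, raising if it is missing."""
    path = Path(models_dir) / model_name
    if not path.exists():
        raise FileNotFoundError(f"No model found at {path}")
    return path

def load_emotion_model(path: Path):
    """Load a pre-trained Keras model (SavedModel folder or .h5 file)."""
    import tensorflow as tf  # lazy import: only needed when actually loading
    return tf.keras.models.load_model(path)
```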
Use the model architectures in `src/model_architectures` to perform training on your custom dataset. Place your dataset in `src/data` and ensure it has `train` and `validation` folders with subfolders for each data class, as we are using TensorFlow's `flow_from_directory` API.
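Since `flow_from_directory` infers class labels from subfolder names, the expected layout can be checked up front with plain `pathlib`. This is a hedged sketch: the class list assumes FER2013 minus the excluded surprise class, and the helper name is illustrative.

```python
from pathlib import Path

# Expected layout for flow_from_directory:
#
# src/data/
#   train/        angry/ disgust/ fear/ happy/ neutral/ sad/
#   validation/   angry/ disgust/ fear/ happy/ neutral/ sad/

CLASSES = ["angry", "disgust", "fear", "happy", "neutral", "sad"]

def validate_layout(data_dir: str, classes=CLASSES) -> list:
    """Return the split/class folders missing from data_dir (empty list if ok)."""
    root = Path(data_dir)
    missing = []
    for split in ("train", "validation"):
        for cls in classes:
            if not (root / split / cls).is_dir():
                missing.append(f"{split}/{cls}")
    return missing
```

Running this before training gives a clearer error than the generator failing mid-setup when a class folder is absent.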
Constants are kept in the `constants.py` file within the `src` folder and can be freely modified. `src/haar_cascades` contains different Haar cascades used for face detection. You may choose to add your own Haar cascade `.xml` file and point to it accordingly.
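Pointing at a cascade file and running detection could look like the following sketch. The helper names and the detector parameters (`scaleFactor`, `minNeighbors`) are illustrative assumptions, not the project's code; `cv2` is imported lazily so the path helper stands alone.

```python
from pathlib import Path

def resolve_cascade(cascades_dir: str, filename: str) -> Path:
    """Resolve a Haar cascade .xml inside src/haar_cascades, raising if absent."""
    path = Path(cascades_dir) / filename
    if path.suffix != ".xml" or not path.exists():
        raise FileNotFoundError(f"Cascade not found: {path}")
    return path

def detect_faces(frame, cascade_path: Path):
    """Run OpenCV's Haar-cascade detector on a BGR frame; returns (x, y, w, h) boxes."""
    import cv2  # lazy import: only needed when actually detecting
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(str(cascade_path))
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```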