FECNet

Facial Expression Feature Extractor
https://github.com/AmirSh15/FECNet.git

This module contains code in support of the paper "A Compact Embedding for Facial Expression Similarity". The experiments are implemented with the PyTorch framework.

In this repository, I used the Inception network implementation from timesler and the DenseNet implementation from gpleiss.

Dependencies

The code was successfully built and run with these versions:

pytorch-gpu 1.2.0
cudnn 7.6.4
cudatoolkit 10.0.130
opencv 3.4.2

Preprocessing Data

For preprocessing, download the Google facial expression comparison (FEC) dataset and extract it into the 'data' folder. Then run 'preprocess.py' to download the images and build the dataset; a sketch of this step follows.
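For orientation, here is a minimal sketch of what this step involves, assuming the FEC csv layout of one image URL followed by four normalized face-box coordinates per triplet image; the csv file name, column indices, and output folder here are illustrative rather than taken from 'preprocess.py'.

```python
# Hypothetical preprocessing sketch: download each face image listed in the
# FEC csv and crop it to its annotated face box. Column positions assume the
# FEC layout (url, left, right, top, bottom per image, three images per row).
import csv
import os
import urllib.request

from PIL import Image

DATA_DIR = "data"        # folder the FEC csv files were extracted into
OUT_DIR = "data/images"  # hypothetical output folder for the cropped faces

os.makedirs(OUT_DIR, exist_ok=True)

with open(os.path.join(DATA_DIR, "faceexp-comparison-data-train-public.csv")) as f:
    for row_id, row in enumerate(csv.reader(f)):
        for img_id in range(3):  # each row describes one triplet of images
            url = row[5 * img_id]
            left, right, top, bottom = (float(v) for v in row[5 * img_id + 1 : 5 * img_id + 5])
            path = os.path.join(OUT_DIR, f"{row_id}_{img_id}.jpg")
            if os.path.exists(path):
                continue
            try:
                urllib.request.urlretrieve(url, path)
                img = Image.open(path).convert("RGB")
                w, h = img.size
                # The box coordinates are normalized to [0, 1]; crop and overwrite.
                img.crop((int(left * w), int(top * h), int(right * w), int(bottom * h))).save(path)
            except Exception:
                pass  # many FEC urls are dead by now; skip failed downloads
```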

Training

I followed the procedure in the main paper to train the network. The goal of that work is to describe facial expressions in a compact, continuous embedding space. The network's backbone is an Inception network (originally trained to cluster faces on VGGFace2), truncated after the inception (4) block. The backbone's weights are kept fixed, and its output feature maps are fed to a DenseNet head consisting of a regular convolution layer with 512 filters and 1x1 kernels, followed by a dense block with 5 layers and a growth rate of 64 (all hyperparameters follow the main paper). Two fully connected layers then reduce the features to 512 dimensions and finally to the 16-dimensional embedding. To train, run the 'FECNet.py' file.
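To make the description above concrete, here is a hedged PyTorch sketch of such a model, using timesler's facenet-pytorch InceptionResnetV1 as the VGGFace2-pretrained backbone. The truncation point chosen here (mixed_6a, which outputs 896 channels in that implementation) and the dense-block wiring are my reading of the description, not necessarily what 'FECNet.py' does.

```python
# Sketch of the FECNet architecture described above (assumptions noted inline).
import torch
import torch.nn as nn
from facenet_pytorch import InceptionResnetV1  # timesler's implementation


class FECNetSketch(nn.Module):
    def __init__(self, embedding_size: int = 16, growth_rate: int = 64):
        super().__init__()
        backbone = InceptionResnetV1(pretrained="vggface2")
        # Keep the backbone up to the mixed_6a stage (an assumption for the
        # "inception (4) block") and freeze it, as the paper fixes these weights.
        self.backbone = nn.Sequential(
            backbone.conv2d_1a, backbone.conv2d_2a, backbone.conv2d_2b,
            backbone.maxpool_3a, backbone.conv2d_3b, backbone.conv2d_4a,
            backbone.conv2d_4b, backbone.repeat_1, backbone.mixed_6a,
        )
        for p in self.backbone.parameters():
            p.requires_grad = False

        # DenseNet head: 1x1 conv with 512 filters, then a dense block of
        # 5 layers with growth rate 64 (512 + 5 * 64 = 832 channels out).
        self.conv1x1 = nn.Conv2d(896, 512, kernel_size=1)
        channels = 512
        layers = []
        for _ in range(5):
            layers.append(nn.Sequential(
                nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1),
            ))
            channels += growth_rate
        self.dense_layers = nn.ModuleList(layers)
        self.fc1 = nn.Linear(channels, 512)
        self.fc2 = nn.Linear(512, embedding_size)

    def forward(self, x):
        x = self.conv1x1(self.backbone(x))
        for layer in self.dense_layers:
            x = torch.cat([x, layer(x)], dim=1)  # dense connectivity
        x = x.mean(dim=(2, 3))                   # global average pooling
        x = self.fc2(torch.relu(self.fc1(x)))
        # L2-normalize so triplet distances are computed on the unit sphere.
        return nn.functional.normalize(x, p=2, dim=1)
```

The paper trains this embedding with a triplet loss over the FEC comparisons, e.g. PyTorch's nn.TripletMarginLoss applied to the three embeddings of each annotated triplet.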

The following pretrained FECNet is available:

| model | pretraining | training | Train acc (%) | Test acc (%) |
| --- | --- | --- | --- | --- |
| inception_resnet_v1 | VGGFace2 | Google facial expression comparison dataset | 75.0 | 64.3 |

You can download the pretrained model here; a sketch for loading it follows below.
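As a usage example, loading the checkpoint and embedding a batch of faces might look like this; the checkpoint file name, its key layout, and the input resolution are assumptions to adjust to the downloaded file.

```python
# Hypothetical usage of the pretrained weights with the sketch class above.
import torch

model = FECNetSketch()
state = torch.load("FECNet_pretrained.pt", map_location="cpu")  # assumed file name
model.load_state_dict(state)
model.eval()

with torch.no_grad():
    faces = torch.randn(4, 3, 224, 224)  # stand-in for preprocessed face crops
    embeddings = model(faces)            # shape: (4, 16)
```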

References

If you find this repo useful, give it a star!

@inproceedings{vemulapalli2019compact,
  title={A Compact Embedding for Facial Expression Similarity},
  author={Vemulapalli, Raviteja and Agarwala, Aseem},
  booktitle={Conference on Computer Vision and Pattern Recognition (CVPR)},
  pages={5683--5692},
  year={2019}
}