Use skip connections to pass Encoder layers to the corresponding Decoder layers by concatenation; the framework is based on GANomaly. This repo implements Skip-GANomaly and Skip-Attention-GANomaly, where CBAM attention is applied to the Encoder features before they are skipped to the Decoder.
Generator + Discriminator model
pip install -r requirements.txt
The image below shows the whole GANomaly network with the skip-connection trick at the beginning of the Encoder-Decoder part.
The image below shows the details of the Encoder-Decoder with the skip-connection trick.
The image below is similar to the U-Net architecture, except that CBAM is applied before the skip connection to the Decoder layer.
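Below is a minimal PyTorch sketch of the CBAM-gated skip connection described above: the encoder feature map is refined by channel and spatial attention before being concatenated with the decoder feature map. The module sizes and the names `CBAM` and `attended_skip` are illustrative assumptions, not this repo's exact implementation.

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Convolutional Block Attention Module (channel then spatial attention)."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        # Channel attention: shared MLP over avg- and max-pooled descriptors
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )
        # Spatial attention: conv over channel-wise avg and max maps
        self.spatial = nn.Conv2d(2, 1, kernel_size,
                                 padding=kernel_size // 2, bias=False)

    def forward(self, x):
        ca = torch.sigmoid(self.mlp(x.mean((2, 3), keepdim=True)) +
                           self.mlp(x.amax((2, 3), keepdim=True)))
        x = x * ca
        sa = torch.sigmoid(self.spatial(torch.cat(
            [x.mean(1, keepdim=True), x.amax(1, keepdim=True)], dim=1)))
        return x * sa

def attended_skip(enc_feat, dec_feat, cbam):
    # Refine the encoder feature map with CBAM, then concatenate with the
    # decoder feature map along the channel axis (U-Net style skip).
    return torch.cat([cbam(enc_feat), dec_feat], dim=1)
```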
Custom Dataset
├── test
│ ├── 0.normal
│ │ └── normal_tst_img_0.png
│ │ └── normal_tst_img_1.png
│ │ ...
│ │ └── normal_tst_img_n.png
│ ├── 1.abnormal
│ │ └── abnormal_tst_img_0.png
│ │ └── abnormal_tst_img_1.png
│ │ ...
│ │ └── abnormal_tst_img_m.png
├── train
│ ├── 0.normal
│ │ └── normal_trn_img_0.png
│ │ └── normal_trn_img_1.png
│ │ ...
│ │ └── normal_trn_img_t.png
python train.py --img-dir "[train dataset dir] or cifar10 or mnist"
--batch-size 64
--img-size 32
--epoch 20
--model "ganomaly or skip-ganomaly or skip-attention-ganomly"
--abnormal-class "airplane"
python test.py --normal-dir "[test normal dataset dir]"
--abnormal-dir "[test abnormal dataset dir]"
--view-img
--img-size 32
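At test time, Skip-GANomaly (see the paper linked below) scores an image as a weighted sum of its reconstruction error R(x) and the latent-feature distance L(x): A(x) = λR(x) + (1 − λ)L(x). A hedged sketch of that score follows; `netG`, `feat_extractor`, and `lam` are assumed names, not this repo's API:

```python
import torch

def anomaly_score(x, netG, feat_extractor, lam=0.9):
    # A(x) = lam * R(x) + (1 - lam) * L(x), per the Skip-GANomaly paper.
    x_hat = netG(x)                                        # reconstruction
    r = torch.mean(torch.abs(x - x_hat), dim=(1, 2, 3))    # R(x): L1 error
    f, f_hat = feat_extractor(x), feat_extractor(x_hat)    # latent features
    l = torch.mean((f - f_hat) ** 2,
                   dim=tuple(range(1, f.dim())))           # L(x): L2 error
    return lam * r + (1 - lam) * l                         # higher = more anomalous
```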
GANomaly: Semi-Supervised Anomaly Detection via Adversarial Training
https://arxiv.org/abs/1805.06725
Skip-GANomaly: Skip Connected and Adversarially Trained Encoder-Decoder Anomaly Detection
https://arxiv.org/pdf/1901.08954.pdf
CBAM: Convolutional Block Attention Module
https://arxiv.org/abs/1807.06521
SAGAN: Skip-Attention GAN for Anomaly Detection
http://personal.ee.surrey.ac.uk/Personal/W.Wang/papers/LiuLZHW_ICIP_2021.pdf