
[TensorFlow] A disentangled generative model for disease decomposition in chest X-rays via normal image synthesis

TensorFlow implementation of Disentangled Generative Model (DGM) with MNIST dataset.

Architecture

Objective Functions

The objective functions (losses) for training DGM [1].
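As a rough illustration only, the sketch below expresses an objective of this shape in TensorFlow 2: an adversarial term, a pixel-wise reconstruction term, and a total variation term. The loss weights `LAMBDA_REC` and `LAMBDA_TV` are placeholders, not values taken from the paper or this repository.

```python
import tensorflow as tf

# Hypothetical loss weights; the paper / repository may use different values.
LAMBDA_REC = 10.0
LAMBDA_TV = 1e-4

def adversarial_loss(d_fake_logits):
    # Non-saturating generator loss: push D(synthesized image) toward "real".
    return tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
        labels=tf.ones_like(d_fake_logits), logits=d_fake_logits))

def reconstruction_loss(x, x_hat):
    # Pixel-wise L1 distance between the input and its reconstruction.
    return tf.reduce_mean(tf.abs(x - x_hat))

def total_variation_loss(x_hat):
    # Encourages spatial smoothness of the synthesized image.
    return tf.reduce_mean(tf.image.total_variation(x_hat))

def generator_objective(x, x_hat, d_fake_logits):
    # Weighted sum of the three terms above.
    return (adversarial_loss(d_fake_logits)
            + LAMBDA_REC * reconstruction_loss(x, x_hat)
            + LAMBDA_TV * total_variation_loss(x_hat))
```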

DGM architecture

The architecture of DGM.
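Purely for orientation, a minimal Keras sketch of an encoder-generator-discriminator layout adapted to 28x28x1 MNIST inputs is shown below; the layer counts, filter sizes, and latent dimension are hypothetical and are not taken from the repository.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_encoder(z_dim=128):
    # Maps a 28x28x1 image to a latent code.
    x = inp = layers.Input(shape=(28, 28, 1))
    for filters in (32, 64):
        x = layers.Conv2D(filters, 3, strides=2, padding="same", activation="relu")(x)
    z = layers.Dense(z_dim)(layers.Flatten()(x))
    return Model(inp, z, name="encoder")

def build_generator(z_dim=128):
    # Maps a latent code back to a 28x28x1 image.
    z = layers.Input(shape=(z_dim,))
    x = layers.Reshape((7, 7, 64))(layers.Dense(7 * 7 * 64, activation="relu")(z))
    for filters in (64, 32):
        x = layers.Conv2DTranspose(filters, 3, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2D(1, 3, padding="same", activation="sigmoid")(x)
    return Model(z, out, name="generator")

def build_discriminator():
    # Scores an image as real (normal) or synthesized.
    x = inp = layers.Input(shape=(28, 28, 1))
    for filters in (32, 64):
        x = layers.Conv2D(filters, 3, strides=2, padding="same", activation="relu")(x)
    logit = layers.Dense(1)(layers.Flatten()(x))
    return Model(inp, logit, name="discriminator")
```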

Graph in TensorBoard

Graph of DGM.
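For reference, one way to export such a graph for TensorBoard is `tf.summary` graph tracing, sketched below; the log directory and the traced forward pass (built from the hypothetical models above) are placeholders.

```python
import tensorflow as tf

encoder, generator = build_encoder(), build_generator()
writer = tf.summary.create_file_writer("logs/dgm")  # arbitrary log directory

@tf.function
def forward(x):
    # Any traced forward pass; here the encoder-generator pipeline.
    return generator(encoder(x))

tf.summary.trace_on(graph=True)
_ = forward(tf.zeros([1, 28, 28, 1]))
with writer.as_default():
    tf.summary.trace_export(name="dgm_graph", step=0)
# Inspect with: tensorboard --logdir logs
```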

Problem Definition

Digit class '1' of MNIST is defined as normal, and all other classes are defined as abnormal.
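A minimal sketch of this split, assuming the standard `tf.keras.datasets.mnist` loader; the constant `NORMAL_CLASS`, the shuffle buffer, and the batch size are illustrative.

```python
import tensorflow as tf

NORMAL_CLASS = 1  # digit "1" is treated as normal; every other digit is abnormal

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

# Train only on normal samples; keep both normal and abnormal samples for testing.
x_train_normal = x_train[y_train == NORMAL_CLASS]
y_test_anomaly = (y_test != NORMAL_CLASS).astype("int32")  # 1 = abnormal, 0 = normal

train_ds = (tf.data.Dataset.from_tensor_slices(x_train_normal)
            .shuffle(10_000)
            .batch(128))
```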

Results

Training Procedure

Losses for training the generative components.
Each graph shows the adversarial loss, the reconstruction loss, and the total variation loss, in that order.

Loss graphs from the training procedure.
Each graph shows the generative loss and the discriminative loss, respectively.
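A hedged sketch of one possible alternating update step is given below, reusing `generator_objective` and the hypothetical models from the sketches above; the optimizers, learning rates, and update schedule are illustrative, not the repository's actual configuration.

```python
import tensorflow as tf

g_opt = tf.keras.optimizers.Adam(1e-4)
d_opt = tf.keras.optimizers.Adam(1e-4)

@tf.function
def train_step(x, encoder, generator, discriminator):
    # Generative update: adversarial + reconstruction + total variation losses.
    with tf.GradientTape() as g_tape:
        x_hat = generator(encoder(x, training=True), training=True)
        g_loss = generator_objective(x, x_hat, discriminator(x_hat, training=True))
    g_vars = encoder.trainable_variables + generator.trainable_variables
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, g_vars), g_vars))

    # Discriminative update: separate real (normal) inputs from reconstructions.
    with tf.GradientTape() as d_tape:
        x_hat = generator(encoder(x, training=False), training=False)
        d_real = discriminator(x, training=True)
        d_fake = discriminator(x_hat, training=True)
        d_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
            labels=tf.ones_like(d_real), logits=d_real))
        d_loss += tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
            labels=tf.zeros_like(d_fake), logits=d_fake))
    d_vars = discriminator.trainable_variables
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, d_vars), d_vars))
    return g_loss, d_loss
```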

Restoration result by DGM.

Test Procedure

Box plot of the encoding loss over the test procedure.

Normal samples classified as normal.

Abnormal samples classified as normal.

Normal samples classified as abnormal.

Abnormal samples classified as abnormal.
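One plausible way to turn the encoding loss into the normal/abnormal decisions above is to threshold a per-sample score, as sketched below. The score definition, the reuse of the hypothetical `encoder`/`generator` and test split from earlier sketches, and the percentile-based threshold are all assumptions rather than the repository's exact procedure.

```python
import numpy as np
import tensorflow as tf

def encoding_loss(x, encoder, generator):
    # One plausible score: squared distance between the latent code of the input
    # and the latent code of its reconstruction. The repository's exact
    # definition of "encoding loss" may differ.
    z = encoder(x, training=False)
    z_hat = encoder(generator(z, training=False), training=False)
    return tf.reduce_mean(tf.square(z - z_hat), axis=-1)

# Score every test sample (x_test / y_test_anomaly come from the dataset sketch).
scores = np.concatenate([
    encoding_loss(x_test[i:i + 256], encoder, generator).numpy()
    for i in range(0, len(x_test), 256)])

# Example threshold: the 95th percentile of scores on known-normal samples.
threshold = np.percentile(scores[y_test_anomaly == 0], 95)
pred_abnormal = scores > threshold  # True -> classified as abnormal
```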

Environment

Reference

[1] Youbao Tang et al. (2021). A disentangled generative model for disease decomposition in chest X-rays via normal image synthesis. Medical Image Analysis. Elsevier.