danielenricocahall / One-Class-NeuralNetwork

MIT License

One-Class-NeuralNetwork

Simplified Keras implementation of one class neural network for nonlinear anomaly detection.

The implementation is based on the approach described here: https://arxiv.org/pdf/1802.06360.pdf. I've included several datasets from ODDS (http://odds.cs.stonybrook.edu/) and the Wine Dataset from UCI (https://archive.ics.uci.edu/ml/datasets/wine) to play with.
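For reference, the objective from that paper can be written down in a few lines of numpy. This is an illustrative sketch of the math only, not the Keras code in this repo; the sigmoid hidden activation and all variable names here are my own assumptions:

```python
import numpy as np

def ocnn_objective(X, V, w, r, nu=0.1):
    """One-class NN objective from Chalapathy et al. (arXiv:1802.06360).

    X: (N, d) data, V: (d, h) hidden-layer weights, w: (h,) output weights,
    r: scalar margin/quantile variable, nu: fraction parameter in (0, 1].
    A sigmoid hidden activation g is assumed here for concreteness.
    """
    g = 1.0 / (1.0 + np.exp(-X @ V))    # hidden activations, shape (N, h)
    y_hat = g @ w                       # scores, shape (N,)
    hinge = np.maximum(0.0, r - y_hat)  # penalize scores that fall below r
    return (0.5 * np.sum(w ** 2)        # weight regularizer on w
            + 0.5 * np.sum(V ** 2)      # weight regularizer on V
            + np.mean(hinge) / nu       # averaged hinge term, scaled by 1/nu
            - r)                        # reward pushing the margin r up
```

Minimizing this over w, V, and r encourages most scores to sit above r while keeping the weights small; points scoring below r are flagged as anomalies.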

Setup

pipenv install . should configure a Python environment and install all necessary dependencies into it.

Running

Running python kdd_cup.py or python wine.py within your new Python environment (either through the CLI or an IDE) should kick off training on the corresponding dataset and generate some output plots.

Testing

Two unit tests are defined in test/test_basic.py: one builds the model, and one checks the quantile loss against the example in the paper.

Execute pytest test to run them.
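A useful property to test numerically (proved in the paper) is that, with the network fixed, the value of r that minimizes the data term of the objective is the ν-quantile of the scores. The sketch below checks this by brute force; it is my own illustration, not the repo's actual test:

```python
import numpy as np

def data_term(r, scores, nu):
    # hinge part of the OC-NN objective, viewed as a function of r alone
    return np.mean(np.maximum(0.0, r - scores)) / nu - r

rng = np.random.default_rng(0)
scores = rng.normal(size=1000)  # stand-in for network outputs
nu = 0.25

# scan candidate r values; the minimizer should sit at the nu-quantile
grid = np.linspace(scores.min(), scores.max(), 2001)
best_r = grid[np.argmin([data_term(r, scores, nu) for r in grid])]
assert abs(best_r - np.quantile(scores, nu)) < 0.05
```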

Results

HTTP Dataset

Loss

[loss plot]

Features

[features plot]

Wine Dataset

Loss

[loss plot]

Features

[features plot]

Notes

Based on the objective function we use, the loss is unbounded, meaning there is no real "convergence", at least on the two test datasets presented here. I've probed the source code and read through the paper several times, and I'm fairly certain the implementation here is accurate. I'm not sure whether this is a limitation of the approach or whether I'm missing something. I'd love to hear from anyone who has any insight on this, especially if you apply it to new datasets.
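One way to see how the loss can keep falling is a toy numpy sketch (my own illustration, not code from this repo), assuming r tracks the ν-quantile of the scores as in the paper's algorithm: scaling the output weights by c > 0 scales the whole data term by c, so once that term is negative it can be driven down without limit unless the weight regularizers push back hard enough.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
V = rng.normal(size=(3, 8))
w = np.abs(rng.normal(size=8))       # positive weights -> strictly positive scores
nu = 0.1
g = 1.0 / (1.0 + np.exp(-X @ V))     # fixed sigmoid hidden layer

def epoch_loss(scale):
    """Data term of the objective with r set to the nu-quantile of the scores."""
    y_hat = g @ (scale * w)
    r = np.quantile(y_hat, nu)
    return np.mean(np.maximum(0.0, r - y_hat)) / nu - r

# epoch_loss(c) == c * epoch_loss(1): larger weights mean a lower reported
# loss, with no floor coming from the data term itself
```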