aimagelab / novelty-detection

Latent space autoregression for novelty detection.
MIT License

There seems to be no training code here. Can you provide training code? #1

Closed: 17764591637 closed this issue 5 years ago

17764591637 commented 5 years ago

I find that there is no training code here. Can you provide the training code? Thanks very much!

DavideA commented 5 years ago

Hi and thanks for the interest.

To train the model, you should minimize the LSALoss in novelty-detection/models/loss_functions/lsaloss.py

You should do something like

opt.zero_grad()
x_r, z, z_dist = model(x)
loss = loss_fn(x, x_r, z, z_dist)  # loss_fn: an instance of LSALoss
loss.backward()
opt.step()

where x is a batch of normal training examples, loss_fn is an LSALoss instance, and opt is your favorite torch optimizer.
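For illustration, a minimal end-to-end sketch built around that snippet might look as follows. The import paths and constructor arguments for LSAMNIST and LSALoss here are assumptions based on the repository layout, not its documented API:

import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

from models import LSAMNIST                    # assumed import path
from models.loss_functions import LSALoss      # assumed import path

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Constructor arguments below are illustrative assumptions.
model = LSAMNIST(input_shape=(1, 28, 28), code_length=64, cpd_channels=100).to(device)
loss_fn = LSALoss(cpd_channels=100)

train_set = datasets.MNIST('data', train=True, download=True,
                           transform=transforms.ToTensor())
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)

opt = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(200):                       # 200 epochs, as the authors used
    for x, _ in train_loader:                  # train on normal samples only
        x = x.to(device)
        opt.zero_grad()
        x_r, z, z_dist = model(x)              # reconstruction, latent code, CPD estimate
        loss_fn(x, x_r, z, z_dist).backward()
        opt.step()

(The paper's one-class protocol trains on a single nominal MNIST class; the loader above uses the full training set only for brevity.)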

Hope this helps, D

17764591637 commented 5 years ago

Thank you very much for your reply. I am very interested in the autoencoding method you proposed.

17764591637 commented 5 years ago

Good morning, sorry to bother you again. I trained the network last night with the following settings: data: MNIST, batch size: 64, optimizer: Adam, epochs: 400. After 400 epochs, the loss dropped from 500 to 167. How many epochs are needed to train the model, or how low should the loss fall before terminating the iteration? Thanks!

DavideA commented 5 years ago

Hi,

In our experiments we trained for 200 epochs. Alternatively, you can monitor the reconstruction error on a validation set and pick your checkpoint accordingly.
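Concretely, checkpoint selection by validation reconstruction error could look like the sketch below, assuming the same (x_r, z, z_dist) model interface; the helper name and loader wiring are illustrative assumptions:

import torch
import torch.nn.functional as F

@torch.no_grad()
def val_reconstruction_error(model, val_loader, device):
    # Mean per-element squared reconstruction error over held-out normal samples.
    model.eval()
    total, n = 0.0, 0
    for x, _ in val_loader:
        x = x.to(device)
        x_r, _, _ = model(x)
        total += F.mse_loss(x_r, x, reduction='sum').item()
        n += x.numel()
    model.train()
    return total / n

best = float('inf')
for epoch in range(200):
    ...  # one training epoch, as in the loop above
    err = val_reconstruction_error(model, val_loader, device)
    if err < best:
        best = err
        torch.save(model.state_dict(), 'best_checkpoint.pth')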

Best, D

GinGinWang commented 5 years ago

Hi, I am replicating your experimental results. Could you tell me the parameters of your Adam optimizer? Another question about monitoring the reconstruction error on a validation set to pick a checkpoint: why not monitor the novelty score instead?

Thanks! Best, Gin