VIS-VAR / LGSC-for-FAS

Learning Generalized Spoof Cues for Face Anti-spoofing
MIT License

Could you share a train log? #5

Closed yongqyu closed 4 years ago

yongqyu commented 4 years ago

When I train on OULU Protocol 1, I find the loss hard to converge; the minimum ACER I get is near 4%. Could you share your train.log or offer any advice?

Thanks.

haochengV commented 4 years ago

Thank you for your question. LGSC is actually easy to train end-to-end and converges within 20 epochs on all datasets used in the paper. Have you loaded the pre-trained model? And do you evaluate the model using the magnitude of the spoof cue map? We will upload the training log later.

yongqyu commented 4 years ago

Thanks for the quick response.

In fact, I re-implemented your model in PyTorch, using pre-trained weights only for the backbone (ResNet-18). The magnitude of the spoof cue map is averaged and then subtracted from one, as in your code.
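The scoring described above might look like the following sketch. This is only my reading of the comment, not code from the repo: the function name `live_score` and the exact convention (mean absolute cue magnitude as the spoof score, live score as one minus that) are assumptions.

```python
import numpy as np


def live_score(cue_map):
    """Hypothetical scoring for a spoof cue map (assumed convention).

    The mean absolute magnitude of the cue map is treated as the
    spoof score; the live score is one minus that value, so a map
    of all zeros (no spoof cues) yields a live score of 1.0.
    """
    spoof_score = np.mean(np.abs(cue_map))
    return 1.0 - spoof_score
```

Under this convention, ACER would then be computed by thresholding `live_score` over the evaluation set.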

Thank you.

wizyoung commented 4 years ago

@yongqyu I suspect your reimplementation of LGSC is incorrect. I'm the 4th author of the LGSC paper, which I completed during my internship. The model loss actually converged much more easily than we expected, and the results reported in the paper did not require much fine-tuning.

I happened to retrain the model on OULU a few months ago. Here is the training log: 0117_margin05_train.log. You can see that the total training loss converges quickly and smoothly. For protocol 1, I chose a checkpoint at about epoch 18 for testing.

My retrained model obtained better results than those reported in the paper on all four protocols, so again, I suspect your reimplementation is incorrect. Please check the details of this repo and our paper carefully.

CHNxindong commented 3 years ago

> When I train on OULU Protocol 1, I find the loss hard to converge. The minimum ACER is near 4%. Could you share your train.log or provide any advice?
>
> Thanks.

@yongqyu Hello, I am also training on OULU. Could you share how you modified train.py? And do you know how to organize the data format expected by oulu.py? Thank you!