First of all, thanks a lot for your open-source contributions. I have a question from reading the code. You set num_datapoints (default 5k) as the size of the target model's training set, and when the attack model is trained, attack_size (default 500) samples are drawn from that training set as the attack model's member set. My concern is with the non-member set: the population set consists of the entire CIFAR training set (50k samples) and test set (10k samples) of the pre-trained AlexNet (the target model). When we sample the same number of points from this population as the attack model's non-member set, we may draw points that actually belong to the target model's training set, i.e., members of the attack model's member set. I think this introduces cheating, since member samples would be mislabeled as non-members.
One workaround I thought of: retrain the AlexNet model so that its training set is exactly the same as the target model's training set used in attack_alexnet.py.
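To make the concern concrete, here is a small hypothetical sketch (the sizes and variable names are my assumptions, not the repo's actual API) showing both the overlap that worries me and a sampling-side fix that avoids retraining by excluding the target's training indices when drawing non-members:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes from the issue: 50k train + 10k test = 60k population
population_size = 60_000
num_datapoints = 5_000   # target model training set size (default 5k)
attack_size = 500        # attack model member/non-member set size (default 500)

# Suppose the target model was trained on these population indices
target_train_idx = rng.choice(population_size, size=num_datapoints, replace=False)

# Current behavior (as I read the code): non-members are sampled from the
# whole population, so some of them can be members of the target's training set
naive_nonmember_idx = rng.choice(population_size, size=attack_size, replace=False)
overlap = np.intersect1d(naive_nonmember_idx, target_train_idx)
print(f"'non-members' that are actually members: {len(overlap)}")

# Sampling-side fix: draw non-members only from indices the target never saw
eligible = np.setdiff1d(np.arange(population_size), target_train_idx)
nonmember_idx = rng.choice(eligible, size=attack_size, replace=False)
assert len(np.intersect1d(nonmember_idx, target_train_idx)) == 0
```

If this disjoint sampling is acceptable, it would avoid retraining AlexNet entirely, but I may be misreading how the population set is constructed.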
Looking forward to hearing from you, thank you!