jinhaoduan / SecMI

[ICML 2023] Are Diffusion Models Vulnerable to Membership Inference Attacks?
MIT License

Reconfirmation of Training (Fine-tuning) Details on Stable Diffusion #7

Closed: zhaisf closed this issue 6 months ago

zhaisf commented 6 months ago

Though you've explained this in issue #6, I'd still like to double-check the detail, because I'm confused by my experiment results.

Referring to the official Stable Diffusion fine-tuning code and the example .sh script, there are two parameters that control the data transforms during fine-tuning, namely args.center_crop and args.random_flip, which select centerCrop vs. randomCrop and randomFlip vs. no-Flip, respectively.

With the default parameter settings (both flags off), the effective combination is actually randomCrop + no-Flip. Here's the code:

# Preprocessing the datasets.
train_transforms = transforms.Compose(
    [
        transforms.Resize(args.resolution, interpolation=transforms.InterpolationMode.BILINEAR),
        transforms.CenterCrop(args.resolution) if args.center_crop else transforms.RandomCrop(args.resolution),
        transforms.RandomHorizontalFlip() if args.random_flip else transforms.Lambda(lambda x: x),
        transforms.ToTensor(),
        transforms.Normalize([0.5], [0.5]),
    ]
)
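In other words, under the defaults the pipeline reduces to the following (a minimal sketch I put together for clarity, not code from the repo; the resolution value of 512 is my assumption):

from torchvision import transforms

# Equivalent pipeline under the default flags (center_crop=False, random_flip=False),
# i.e. randomCrop + no-Flip. resolution=512 is assumed here.
resolution = 512
default_train_transforms = transforms.Compose(
    [
        transforms.Resize(resolution, interpolation=transforms.InterpolationMode.BILINEAR),
        transforms.RandomCrop(resolution),  # center_crop is False by default -> random crop
        transforms.Lambda(lambda x: x),     # random_flip is False by default -> no flip
        transforms.ToTensor(),
        transforms.Normalize([0.5], [0.5]),
    ]
)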

I'd like to know which parameter combination you used.

I conducted experiments with various data-augmentation combinations following your setup (COCO dataset, 2500/2500 split, 150,000 steps, as described in your paper).

The experimental results align with yours only for randomCrop + no-Flip (ASR/AUC: 0.8334/0.9105). centerCrop + no-Flip yields higher performance (ASR ≈ 0.90), while randomCrop + randomFlip yields much lower performance (ASR ≈ 0.75).

So, did you use the combination of randomCrop and no-Flip?

Thank you for your time, and I look forward to your response!

jinhaoduan commented 6 months ago

We used the default parameter settings for data augmentation during fine-tuning, so it should be randomCrop + no-Flip.
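To see why the defaults land there: both options are plain store_true switches in the fine-tuning script, so omitting them from the launch command leaves both at False. A rough sketch (help strings paraphrased, not copied from the script):

import argparse

# Rough sketch of the relevant flags. Both are store_true switches, so leaving
# them off the command line keeps them False, which selects RandomCrop and
# disables horizontal flipping.
parser = argparse.ArgumentParser()
parser.add_argument(
    "--center_crop",
    default=False,
    action="store_true",
    help="Center-crop images to the resolution instead of random-cropping.",
)
parser.add_argument(
    "--random_flip",
    action="store_true",
    help="Randomly flip images horizontally.",
)
args = parser.parse_args([])  # no flags passed -> center_crop=False, random_flip=False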

zhaisf commented 6 months ago

I see. Thanks a lot !