awslabs / fast-differential-privacy

Fast, memory-efficient, scalable optimization of deep learning with differential privacy
Apache License 2.0

Is the PrivacyEngine compatible with learning rate schedulers like torch.optim.lr_scheduler? #15

Closed shuqike closed 9 months ago

shuqike commented 9 months ago

Is the PrivacyEngine class compatible with learning rate schedulers in torch.optim.lr_scheduler?

Or, is the following implementation valid in the context of DP (ignoring some details, such as module imports)?

from fastDP import PrivacyEngine
optimizer = SGD(model.parameters(), lr=0.05)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
privacy_engine = PrivacyEngine(
    model,
    batch_size=256,
    sample_size=50000,
    epochs=3,
    target_epsilon=2,
    clipping_fn='automatic',
    clipping_mode='MixOpt',
    origin_params=None,
    clipping_style='all-layer',
)
privacy_engine.attach(optimizer)

for ep in range(10):
    for i, batch in enumerate(dataloader):
        inputs, labels = batch
        loss = F.cross_entropy(model(inputs), labels)
        loss.backward()
        if i % gradient_accumulation_steps == 0:
            optimizer.step()
            optimizer.zero_grad()
    scheduler.step()  # step the scheduler once per epoch

Thank you!

woodyx218 commented 9 months ago

Yes, any LR scheduler should work. One example is our table2text example, where linear_schedule_with_warmup is used.
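
For reference, here is a minimal sketch of combining PrivacyEngine with a warmup scheduler, following the pattern from the question above. It assumes the Hugging Face transformers function get_linear_schedule_with_warmup (the repo's table2text script is not reproduced here), an already-defined model and dataloader, and an arbitrary num_warmup_steps of 100. The scheduler only rescales the learning rate stored in optimizer.param_groups, so it does not interact with the per-sample clipping or noise addition performed by the attached engine.

import torch.nn.functional as F
from torch.optim import SGD
from fastDP import PrivacyEngine
from transformers import get_linear_schedule_with_warmup  # assumes HF transformers is installed

# model and dataloader are assumed to be defined elsewhere
optimizer = SGD(model.parameters(), lr=0.05)

privacy_engine = PrivacyEngine(
    model,
    batch_size=256,
    sample_size=50000,
    epochs=3,
    target_epsilon=2,
    clipping_fn='automatic',
    clipping_mode='MixOpt',
    origin_params=None,
    clipping_style='all-layer',
)
privacy_engine.attach(optimizer)  # wraps optimizer.step() with clipping and noising

# Warmup schedulers are typically stepped once per optimizer step, not per epoch.
num_training_steps = 3 * len(dataloader)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=100,                 # illustrative value, not taken from the repo
    num_training_steps=num_training_steps,
)

for epoch in range(3):
    for inputs, labels in dataloader:
        loss = F.cross_entropy(model(inputs), labels)
        loss.backward()
        optimizer.step()      # private update: clip per-sample grads, add noise
        optimizer.zero_grad()
        scheduler.step()      # only rescales the learning rate; DP machinery is untouched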