Is the PrivacyEngine class compatible with the learning rate schedulers in torch.optim.lr_scheduler? In other words, is the following implementation valid in the context of DP (ignoring details such as module imports)?
import torch
import torch.nn.functional as F
from torch.optim import SGD
from fastDP import PrivacyEngine

optimizer = SGD(model.parameters(), lr=0.05)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
privacy_engine = PrivacyEngine(
    model,
    batch_size=256,
    sample_size=50000,
    epochs=3,
    target_epsilon=2,
    clipping_fn='automatic',
    clipping_mode='MixOpt',
    origin_params=None,
    clipping_style='all-layer',
)
privacy_engine.attach(optimizer)

for ep in range(3):  # match epochs=3 used for the privacy accounting
    for i, (batch, labels) in enumerate(dataloader):
        loss = F.cross_entropy(model(batch), labels)
        loss.backward()
        # step only after a full accumulation cycle (i % ... == 0
        # would also step on the very first micro-batch)
        if (i + 1) % gradient_accumulation_steps == 0:
            optimizer.step()
            optimizer.zero_grad()
    scheduler.step()  # decay the learning rate once per epoch
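For reference, my understanding is that a scheduler only mutates optimizer.param_groups[...]['lr'] in place, so as long as privacy_engine.attach(optimizer) wraps optimizer.step rather than replacing the parameter groups, the decayed learning rate should be picked up automatically; and since the DP accounting depends on the noise, clipping, and number of steps rather than the learning rate, the guarantee should be unaffected. Here is a minimal sanity check I would run, using no fastDP-specific API (the attach call is only indicated as a comment, since that is exactly the behavior I am asking about):

import torch
from torch.optim import SGD

# Tiny stand-in parameter; no model or privacy engine is needed just
# to observe the scheduler's effect on the optimizer's param_groups.
params = [torch.nn.Parameter(torch.randn(2, 2))]
optimizer = SGD(params, lr=0.05)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
# privacy_engine.attach(optimizer)  # would go here in the DP run

for epoch in range(3):
    print(epoch, optimizer.param_groups[0]['lr'])  # 0.05, 0.045, 0.0405
    scheduler.step()  # rescales param_groups[...]['lr'] in place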
Thank you!