lokeshn011101 closed this issue 1 year ago
@lokeshn011101
I appreciate your interest in our project!
The original paper says that GradInversion assumes the batch size is smaller than the number of classes in order to estimate the labels. If you want to use a larger batch size, one workaround is to set `optimize_label=True`, though the optimization may become less stable.
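For intuition, the label-estimation trick behind this assumption can be sketched outside of AIJack (the model, shapes, and labels below are made up for illustration, not AIJack's API): with cross-entropy loss, the gradient of the final layer's bias equals the batch mean of `softmax − one_hot`, so entries for classes present in the batch are pushed negative, and each image's label is read off a distinct negative entry. That only works when every label in the batch is distinct, i.e. batch size ≤ number of classes.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy setup (hypothetical, not AIJack's API): 10 classes, batch of 4.
num_classes, dim, batch = 10, 32, 4
layer = nn.Linear(dim, num_classes)
nn.init.zeros_(layer.weight)  # zero init -> uniform softmax, exact signs
nn.init.zeros_(layer.bias)
y = torch.tensor([1, 3, 7, 9])  # all labels distinct (batch <= classes)

loss = F.cross_entropy(layer(torch.randn(batch, dim)), y)
loss.backward()

# bias.grad[c] = 1/num_classes - count(c in batch)/batch, so exactly
# the classes present in the batch have negative entries.
est = torch.topk(-layer.bias.grad, k=batch).indices.sort().values
print(est.tolist())  # [1, 3, 7, 9]
```

With `batch` distinct labels, picking the `batch` most negative bias-gradient entries recovers the label set exactly.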
It would be really helpful if you have time to create a pull request that adds an exception for this case.
For example:
https://github.com/Koukyosyumei/AIJack/blob/main/src/aijack/attack/inversion/gradientinversion.py
```python
def group_attack(self, received_gradients, batch_size=1):
    """Run multiple simultaneous attacks with different random states.

    Args:
        received_gradients: the list of gradients received from the client.
        batch_size: batch size.

    Returns:
        a tuple of the best reconstructed images and corresponding labels
    """
    if (batch_size > self.y.shape[0]) and (not self.optimize_label):
        raise ValueError(
            f"batch size (= {batch_size}) must not be greater than "
            f"the number of classes (= {self.y.shape[0]})"
        )
    group_fake_x = []
    group_fake_label = []
    group_optimizer = []
```
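To see why such a guard is needed, here is a sketch (again with made-up shapes, not AIJack's API) of the same bias-gradient label trick when the batch is larger than the number of classes: duplicate labels are then unavoidable, the uniform softmax term `1/num_classes` can outweigh a single occurrence's `1/batch`, and the negative-entry signal no longer identifies the labels.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Same toy model, but now batch (12) > num_classes (10).
num_classes, dim, batch = 10, 32, 12
layer = nn.Linear(dim, num_classes)
nn.init.zeros_(layer.weight)  # zero init -> uniform softmax, exact signs
nn.init.zeros_(layer.bias)
y = torch.tensor([0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 9])  # duplicates forced

loss = F.cross_entropy(layer(torch.randn(batch, dim)), y)
loss.backward()

# bias.grad[c] = 1/10 - count(c)/12: classes seen once get a *positive*
# entry (0.1 - 1/12 > 0), so only the duplicated classes 0 and 9 stay
# negative and the per-image labels can no longer be read off.
negatives = (layer.bias.grad < 0).sum().item()
print(negatives)  # 2
```

This is the regime where `optimize_label=True` (treating the labels as additional optimization variables) becomes necessary, at the cost of a harder, less stable optimization.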
Thanks for the clarification, it definitely helped! Will make a PR soon.
Hi, I am using the below code to try AIJack.
It throws the below error, but when I set `batch_size` to any value less than or equal to 10, I don't get the error. Can anyone tell me what's wrong with this?