Implement a wrapper of torch.Tensor for Paillier Encryption (using python-paillier)
Implement MainTask class for Paillier Encryption
The current implementation has the following limitations:
- It supports only averaging as the aggregation method.
- ~~The active party cannot train its local model, since it does not possess the secret key.~~
Specifically, the current implementation adopts the following protocol. Let a data sample $x$ be vertically partitioned into $(x_0, x_1, x_2, ..., x_{n-1})$ across $n$ parties. The $0$-th party (the active party) holds the ground truth $y$.
The $k$-th passive party ($k = 1, 2, ..., n-1$) computes $o_k = f_k(x_k)$, encrypts it, and sends $[[o_k]]$ to the active party.
The active party averages the encrypted predictions and constructs the final prediction $[[\hat{y}]] = \frac{1}{n} (o_0 + \sum_{k=1}^{n-1} [[o_k]])$.
The active party calculates the loss gradient w.r.t. the output of each local model. For example, if the loss is MSE, $\frac{\partial \ell}{\partial o_k} \propto (\hat{y} - y)$.
The active party sends the encrypted gradient to the passive parties. Each passive party decrypts the gradient and backpropagates it to update its local model.
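One round of the protocol above can be sketched end-to-end with a self-contained textbook Paillier (a stand-in for python-paillier, which additionally supports floats via fixed-point encoding; all values and primes here are toy illustrations, not secure parameters). To stay in integers, the demo works with the sum $\sum_k o_k = n\hat{y}$ and the $n$-scaled gradient $\sum_k o_k - n y$ rather than dividing by $n$ inside the ciphertext.

```python
import random
from math import gcd

# --- minimal textbook Paillier (stand-in for python-paillier; NOT secure) ---
P, Q = 1_000_003, 1_000_033
N, N2, G = P * Q, (P * Q) ** 2, P * Q + 1
LAM = (P - 1) * (Q - 1) // gcd(P - 1, Q - 1)
MU = pow((pow(G, LAM, N2) - 1) // N, -1, N)

def enc(m):                            # parties encrypt with the public key
    r = random.randrange(2, N)
    while gcd(r, N) != 1:
        r = random.randrange(2, N)
    return pow(G, m % N, N2) * pow(r, N, N2) % N2

def dec(c):                            # only the secret-key holder can decrypt
    return (pow(c, LAM, N2) - 1) // N * MU % N

add = lambda c1, c2: c1 * c2 % N2                      # [[a]] + [[b]]
add_plain = lambda c, k: c * pow(G, k % N, N2) % N2    # [[a]] + k

# --- one round with n = 3 parties; party 0 is active ---
n = 3
y = 10                                 # ground truth held by the active party
o = [4, 3, 5]                          # local outputs o_k = f_k(x_k)

# Passive parties send encrypted outputs; the active party adds its own o_0.
agg = enc(o[0])
for ok in o[1:]:
    agg = add(agg, enc(ok))            # [[o_0 + o_1 + o_2]] = [[n * y_hat]]

# Encrypted MSE-style gradient, scaled by n: [[sum(o_k) - n*y]].
grad_enc = add_plain(agg, -n * y)

# A passive party decrypts and maps the result back to a signed integer.
g = dec(grad_enc)
g = g - N if g > N // 2 else g
print(g)                               # sum(o) - n*y = 12 - 30 = -18
```

Note that the active party never sees the passive parties' plaintext outputs, and the passive parties never see $y$: they only receive the (encrypted, then decrypted) aggregate gradient.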
Demo