
PPFL

This is an application that runs privacy-preserving federated learning with Trusted Execution Environments (TEEs). A layer-wise training technique keeps the layers being trained inside trusted areas at all times. The application has two parts: 1) the server side, which uses SGX via the Open Enclave SDK for secure aggregation; 2) the client side, which uses TrustZone via OP-TEE.


Please consider citing the corresponding paper at MobiSys 2021 if this project is helpful to you:

PPFL: Privacy-preserving Federated Learning with Trusted Execution Environments. Fan Mo, Hamed Haddadi, Kleomenis Katevas, Eduard Marin, Diego Perino, Nicolas Kourtellis. MobiSys 2021.

Also check our Teaser Video and Video Presentation.

Prerequisites

To run this application, you will need one SGX-enabled PC as the server and one TrustZone-enabled device as the client.

Check this link and make sure your PC has SGX.
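One quick (heuristic) check on Linux is to look for the SGX CPU flag; whether it shows up depends on your kernel version (5.11+) and BIOS settings, so treat a negative result with care:

grep -m1 -o 'sgx' /proc/cpuinfo        # prints "sgx" if the kernel exposes the flag
cpuid | grep -i sgx | head -5          # alternative, if the cpuid utility is installed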

Check this link for TrustZone-enabled devices with OP-TEE support.

Setup

(1) Set up Client Side

Follow the client-side instructions. At the end, make sure you can train a model on-device using TrustZone. For example, test it using the command below:

darknetp classifier train -pp_start 4 -pp_end 10 cfg/mnist.dataset cfg/mnist_lenet.cfg

When everything is ready, you will see output showing that the model has started training.
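Here, we read the -pp_start and -pp_end flags as the first and last layer indices kept inside the TEE (layers 4 to 10 in the command above), with the remaining layers running in the normal world. The layer-wise scripts in later steps vary this range so that the layers currently being trained always stay inside trusted memory. A purely illustrative sketch (arbitrary layer indices; the real orchestration lives in fl_script/fl_tee_layerwise.sh):

# Illustrative only: protect successive layer ranges in consecutive runs.
darknetp classifier train -pp_start 0 -pp_end 2 cfg/mnist.dataset cfg/mnist_lenet.cfg
darknetp classifier train -pp_start 2 -pp_end 4 cfg/mnist.dataset cfg/mnist_lenet.cfg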

(2) Set up Server Side

Step 1:

You will need an Ubuntu 18.04 PC; first install the Open Enclave SDK to support the use of SGX. Follow these instructions to do so.

To verify that you are able to run samples/applications, build the samples following BuildSamplesLinux, and then try to run the Hello World sample.
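Assuming the SDK was installed under /opt/openenclave (the default location in the instructions), building and running the Hello World sample looks roughly like this:

. /opt/openenclave/share/openenclave/openenclaverc    # load the Open Enclave environment
cp -r /opt/openenclave/share/openenclave/samples ~/oe-samples
cd ~/oe-samples/helloworld
mkdir build && cd build
cmake .. && make
make run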

Step 2:

First, clone the (server-side) code and compile it using the Open Enclave SDK.

git clone https://github.com/mofanv/privacy-preserving-federated-learning.git
cd ./privacy-preserving-federated-learning/server_side_sgx/
make build
make run

Step 3:

Set IP_CLIENT_1, PASSWORD_CLIENT1, NUM_ROUNDS, NUM_CLIENTS, PP_START, PP_END, DATASET, MODEL, etc. in the fl_script/fl_tee*.sh files.
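For example, a single-client configuration might look like the following (all values are placeholders and the comments reflect our reading of each variable; adjust them to your own device and experiment):

IP_CLIENT_1=192.168.0.101     # placeholder: IP address of the TrustZone client device
PASSWORD_CLIENT1=secret       # placeholder: login password used to reach the client
NUM_ROUNDS=10                 # number of federated learning rounds
NUM_CLIENTS=1                 # number of participating clients
PP_START=4                    # first layer kept inside the TEE
PP_END=10                     # last layer kept inside the TEE
DATASET=cfg/mnist.dataset     # dataset configuration (as on the client side)
MODEL=cfg/mnist_lenet.cfg     # model configuration (as on the client side)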

Step 4:

Run fl_tee*.sh for FL with TEE protection on training at both the server and client side. For example:

cd fl_script/
./fl_tee_layerwise.sh

Note that, as a fallback, you could try to run both sides in simulation, i.e., Open Enclave on a PC without SGX and OP-TEE in QEMU. However, this has not been tested.
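If you do want to experiment with that, the usual starting points are Open Enclave's simulation mode (creating the enclave with the OE_ENCLAVE_FLAG_SIMULATE flag, or passing --simulate where a sample's host supports it) and the standard OP-TEE QEMU environment, for example:

# Standard OP-TEE QEMU (Armv8) environment; see the OP-TEE build documentation.
mkdir optee-qemu && cd optee-qemu
repo init -u https://github.com/OP-TEE/manifest.git -m qemu_v8.xml
repo sync
cd build
make toolchains
make run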

License

MIT License

Copyright (c) 2021 Fan Mo, f.mo18@imperial.ac.uk

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.