Paper | Usage | Citation | Video
This repository contains the official code for our ICLR 2023 spotlight paper "MPCFormer: fast, performant and private Transformer inference with MPC". MPCFormer protects users' data privacy by running inference under Secure Multiparty Computation (MPC), while also meeting other real-world requirements: it achieves a 5.26x speedup for BERT-Base MPC inference while preserving similar ML accuracy. More comprehensive results, e.g., on BERT-Large and RoBERTa, can be found in the paper.
To install the necessary packages, install the transformers directory in editable mode:
```bash
git clone https://github.com/MccRee177/MPCFormer
cd MPCFormer/transformers
pip install -e .
```
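To confirm that the editable install is picked up, a quick sanity check (assuming a standard Python environment) is to print where the package resolves from:

```python
# Sanity check: the printed path should point inside the cloned
# MPCFormer/transformers directory, not a site-packages copy.
import transformers
print(transformers.__file__)
```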
We support GLUE and IMDB; other datasets can be easily supported via the transformers library.
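For example, a minimal sketch of loading these datasets with the Hugging Face datasets package (an assumption on our part; the training scripts in this repository may load data differently):

```python
# Minimal sketch: load a GLUE task (SST-2) and IMDB via the Hugging Face
# `datasets` package (pip install datasets). Hypothetical usage, not
# necessarily the repository's own data pipeline.
from datasets import load_dataset

sst2 = load_dataset("glue", "sst2")  # one of the GLUE tasks
imdb = load_dataset("imdb")          # IMDB sentiment classification

print(sst2["train"][0])
print(imdb["train"][0])
```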
If you find this repository useful, please cite our paper using:
```bibtex
@article{li2022mpcformer,
  title={MPCFormer: fast, performant and private Transformer inference with MPC},
  author={Li, Dacheng and Shao, Rulin and Wang, Hongyi and Guo, Han and Xing, Eric P and Zhang, Hao},
  journal={arXiv preprint arXiv:2211.01452},
  year={2022}
}
```