bargavj / EvaluatingDPML

This project's goal is to evaluate the privacy leakage of differentially private machine learning models.
MIT License

How to get non-private graphs #29

Closed · sheikhnazimuddin closed this 2 years ago

sheikhnazimuddin commented 2 years ago

I have run this code and am getting all the graphs, such as accuracy loss and privacy leakage at various privacy budgets. I was wondering how I can generate the privacy leakage for the non-private model and plot it along with the perturbed results. I'd greatly appreciate it if you could help. Thanks.

bargavj commented 2 years ago

Hi, I believe you are referring to interpret_results.py for 'improved_mi'. You can get the privacy leakage for the non-private model by not passing any 'eps' value as an argument. Since 'eps' defaults to None, the code will print the results for the non-private model.
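
For the plotting part of the question, here is a minimal sketch (not part of the repo) of overlaying the non-private leakage as a baseline on top of the per-epsilon results. The epsilon values and leakage numbers below are placeholders, not outputs of the repo's scripts; substitute the numbers printed by interpret_results.py for your own runs.

```python
# Minimal plotting sketch: overlay the non-private privacy leakage as a
# horizontal baseline on the per-epsilon leakage curve.
# All numeric values here are placeholders -- replace them with the values
# reported by interpret_results.py for your experiments.
import matplotlib.pyplot as plt

epsilons = [0.01, 0.1, 1.0, 10.0, 100.0]          # privacy budgets you evaluated
private_leakage = [0.02, 0.05, 0.11, 0.25, 0.38]  # leakage reported for each epsilon
non_private_leakage = 0.45                         # leakage reported when eps is None

plt.plot(epsilons, private_leakage, marker='o', label='DP model')
plt.axhline(non_private_leakage, color='red', linestyle='--',
            label='Non-private model')
plt.xscale('log')
plt.xlabel('Privacy budget (epsilon)')
plt.ylabel('Privacy leakage')
plt.legend()
plt.tight_layout()
plt.savefig('privacy_leakage_vs_epsilon.png')
```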