This pull request introduces support for training models with DPSGD (Differentially Private Stochastic Gradient Descent). DPSGD allows the model to learn while preserving the privacy of individual training examples, making it suitable for scenarios involving sensitive data.
This looks good, but I think you should use a library (or write your own) for privacy accounting, or, ideally, accept a target epsilon as a parameter in the config and derive the noise level from it.
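To make the accounting ask concrete, here is a minimal sketch of a self-contained accountant using the standard Gaussian-mechanism bound with naive composition. The function name `naive_epsilon` is hypothetical (not from this PR), and this bound is much looser than the moments/RDP accountants that libraries such as Opacus or TensorFlow Privacy provide, so it should only be a stopgap:

```python
import math

def naive_epsilon(noise_multiplier: float, steps: int, delta: float) -> float:
    """Loose (epsilon, delta) estimate for DPSGD via naive composition.

    Uses the classic Gaussian-mechanism bound for sensitivity-1 queries:
    a single step with noise sigma satisfies
        eps = sqrt(2 * ln(1.25 / delta)) / sigma,
    and naive composition simply sums epsilon over all steps.
    A proper RDP/moments accountant gives a much tighter number.
    """
    eps_per_step = math.sqrt(2.0 * math.log(1.25 / delta)) / noise_multiplier
    return steps * eps_per_step
```

Going the other direction (pass epsilon in the config, solve for the noise multiplier) is what the comment above suggests as the ideal; a library accountant usually exposes that inversion directly.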
Also, can you update the config and add an `if/else` wherever necessary so that anyone can opt into the DPSGD routine?
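Something like the following sketch is what I have in mind for the config gate. The names (`use_dpsgd`, `dpsgd_step`, `train_step`, the config keys) are illustrative, not the ones in this PR, and the update uses NumPy arrays rather than the actual framework tensors:

```python
import numpy as np

def dpsgd_step(params, per_example_grads, lr, clip_norm, noise_multiplier, rng=None):
    """One DPSGD update: clip each per-example gradient, average, add Gaussian noise."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale the gradient down only if its norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    # Noise std follows the usual DPSGD recipe: sigma * C / batch_size.
    noise = rng.normal(
        0.0,
        noise_multiplier * clip_norm / len(per_example_grads),
        size=mean_grad.shape,
    )
    return params - lr * (mean_grad + noise)

def train_step(params, per_example_grads, config, rng=None):
    """Config-gated step: DPSGD when enabled, plain SGD otherwise."""
    if config.get("use_dpsgd", False):
        return dpsgd_step(
            params,
            per_example_grads,
            lr=config["lr"],
            clip_norm=config["clip_norm"],
            noise_multiplier=config["noise_multiplier"],
            rng=rng,
        )
    # Default path: ordinary averaged-gradient SGD, untouched for existing users.
    return params - config["lr"] * np.mean(per_example_grads, axis=0)
```

With `use_dpsgd: false` (or absent) the existing behavior is unchanged, which is the point of the gate.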