FedML-AI / FedNLP

FedNLP: An Industry and Research Integrated Platform for Federated Learning in Natural Language Processing, Backed by FedML, Inc. The Previous Research Version Was Accepted to NAACL 2022

Too many unused configuration options/functionalities in the model trainer class (e.g., model/fed_transformers/classification/classification_model.py) #3

Closed: chaoyanghe closed this issue 3 years ago

chaoyanghe commented 3 years ago

For example, the ONNX export and quantization options are never used here.

Another issue is that a single function has 400 lines of code. It would be better to simplify this class for FL; the current code is mainly written for centralized training. Can we get it down to roughly 100 lines, or split it into more functions or classes?
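As a rough illustration of that split (a hypothetical sketch, not the actual FedNLP or ClassificationModel code; all class and method names below are made up), each concern gets its own short method, and the FL side only needs weight get/set hooks on top of a plain local training loop:

```python
from torch.utils.data import DataLoader


class SimpleClassificationTrainer:
    """Illustrative slimmed-down trainer: one responsibility per method,
    instead of a single 400-line train() that also handles ONNX export,
    quantization, and other centralized-training extras."""

    def __init__(self, model, optimizer, device="cpu"):
        self.model = model
        self.optimizer = optimizer
        self.device = device

    def train_one_epoch(self, loader: DataLoader):
        self.model.train()
        total_loss = 0.0
        for batch in loader:
            total_loss += self._train_step(batch)
        return total_loss / max(len(loader), 1)

    def _train_step(self, batch):
        # Assumes a HuggingFace-style batch: a dict of tensors that includes
        # labels, so the forward pass returns an output object with .loss.
        batch = {k: v.to(self.device) for k, v in batch.items()}
        self.optimizer.zero_grad()
        outputs = self.model(**batch)
        outputs.loss.backward()
        self.optimizer.step()
        return outputs.loss.item()

    # FL-specific hooks: exchange weights with the aggregation server,
    # nothing else lives in the trainer.
    def get_model_params(self):
        return {k: v.cpu() for k, v in self.model.state_dict().items()}

    def set_model_params(self, state_dict):
        self.model.load_state_dict(state_dict)
```

The point of the sketch is that the per-round FL logic only touches `get_model_params` / `set_model_params` plus `train_one_epoch`, so everything else in the current class can be dropped or moved out.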

yuchenlin commented 3 years ago

Yeah, I think it can be simplified. You can go ahead and do it yourself based on your ideas, and I will review it when you are done. Just make sure it can still run with the TransformerTrainer. Let me know when you have finished bug-testing it.

chaoyanghe commented 3 years ago

> Yeah, I think it can be simplified. You can go ahead and do it yourself based on your ideas, and I will review it when you are done. Just make sure it can still run with the TransformerTrainer. Let me know when you have finished bug-testing it.

Okay. As long as training is not blocked, let's keep it as it is until after the ICML deadline; I will do it then. I don't have the bandwidth to refactor it right now.

chaoyanghe commented 3 years ago

@yuchenlin I am refactoring this class. I will let you review the code once I have finished.

yuchenlin commented 3 years ago

Sure, take your time. No need to rush it for now. You can start with the ClassificationModel.