hshokrig opened this issue 1 year ago
Our MLoN workshop is on March 30 and 31.
Each Study Group (SG) is expected to prepare a 45-minute oral presentation, followed by a 15-minute Q&A. Moreover, each SG should design a poster based on its presentation for the poster session, where we can take additional questions that do not fit into the Q&A.
Below, I have provided some potential topics, but you are welcome to suggest new topics as well. You should choose some prominent papers (they do not have to be recent or the ones we suggested) to cover various aspects of the topic. As soon as possible, send me, by replying to this issue, a list of three topics for your group (with your priority ranking). We will then assign the topics and scheduled presentation times. ALL MEMBERS MUST PRESENT ON THE VIRTUAL WORKSHOP DAYS IF THEY WANT THE COURSE CREDIT. Exceptions are allowed for emergencies, but you must inform us in advance if you cannot attend.
Special Topic 1: Privacy and security in distributed learning
- Kairouz, Peter, et al. "Advances and open problems in federated learning." arXiv preprint arXiv:1912.04977 (2019).
- Abadi, Martin, et al. "Deep learning with differential privacy." Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security (2016).
- Dwork, Cynthia, and Aaron Roth. "The algorithmic foundations of differential privacy." Foundations and Trends in Theoretical Computer Science 9.3–4 (2014): 211–407.
- Du, Wenliang, and Mikhail J. Atallah. "Secure multi-party computation problems and their applications: a review and open problems." Proceedings of the 2001 Workshop on New Security Paradigms (2001).
Special Topic 2: Handling Non-IID datasets
- Zhao, Yue, et al. "Federated learning with non-IID data." arXiv preprint arXiv:1806.00582 (2018).
- Sattler, Felix, et al. "Robust and communication-efficient federated learning from non-IID data." IEEE Transactions on Neural Networks and Learning Systems (2019).
- Li, Xiang, et al. "On the convergence of FedAvg on non-IID data." arXiv preprint arXiv:1907.02189 (2019).
Special Topic 3: Model Compression
- Cheng, Yu, et al. "A survey of model compression and acceleration for deep neural networks." arXiv preprint arXiv:1710.09282 (2017).
- Han, Song, Huizi Mao, and William J. Dally. "Deep compression: Compressing deep neural networks with pruning, trained quantization and Huffman coding." arXiv preprint arXiv:1510.00149 (2015).
- Iandola, Forrest N., et al. "SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model size." arXiv preprint arXiv:1602.07360 (2016).
Special Topic 4: Error compensation and variance reduction
- Stich, Sebastian U., and Sai Praneeth Karimireddy. "The error-feedback framework: Better rates for SGD with delayed gradients and compressed communication." arXiv preprint arXiv:1909.05350 (2019).
- Horváth, Samuel, et al. "Stochastic distributed learning with gradient quantization and variance reduction." arXiv preprint arXiv:1904.05115 (2019).
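As a rough illustration of the error-compensation idea behind Topic 4: when gradients are compressed before being applied, the compression error can be remembered and added back at the next step. Below is a minimal single-node sketch of error-feedback SGD with top-k compression; all function names and constants here are illustrative, not taken from any specific paper.

```python
# Minimal sketch of error-feedback (EF) SGD with top-k gradient
# compression. Illustrative only; not the algorithm of any listed paper.
import numpy as np

def top_k(v, k):
    """Keep the k largest-magnitude entries of v, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def ef_sgd(grad_fn, x0, lr=0.1, k=2, steps=200):
    x = x0.astype(float)
    e = np.zeros_like(x)           # error memory (compensation buffer)
    for _ in range(steps):
        g = grad_fn(x)
        p = lr * g + e             # add back previously dropped mass
        c = top_k(p, k)            # compressed update actually applied
        e = p - c                  # remember what compression dropped
        x -= c
    return x

# Toy quadratic: f(x) = 0.5 * ||x - b||^2, so grad f(x) = x - b.
b = np.array([1.0, -2.0, 3.0, 0.5])
x_star = ef_sgd(lambda x: x - b, np.zeros(4))
print(np.allclose(x_star, b, atol=1e-3))  # EF still converges despite compression
```

Without the error buffer `e`, the coordinates that top-k repeatedly drops would never be updated; with it, their contribution accumulates until it is large enough to be transmitted.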
Hi!
We are group 1 (Yasaman, Firooz, and Ozan). We are mainly interested in the 4th topic, on distributed variance reduction and decentralized constrained optimization, since Firooz has been working in this area. Here is the list of papers we are planning to present:
- Double Averaging and Gradient Projection: Convergence Guarantees for Decentralized Constrained Optimization (https://arxiv.org/abs/2210.03232)
- Push–Pull Gradient Methods for Distributed Optimization in Networks (https://arxiv.org/abs/1810.06653)
- Distributed Subgradient Projection Algorithm Over Directed Graphs (https://ieeexplore.ieee.org/document/7582416)
In this case, our ranking is:
4th topic
2nd topic
3rd topic
Best regards, Ozan
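For intuition on what the decentralized constrained methods in this area do, here is a toy consensus-plus-projection loop: each agent takes a local gradient step, averages with its neighbors, and projects onto the constraint set. This is a hypothetical sketch, not the exact update rule of any of the papers listed.

```python
import numpy as np

# Toy decentralized projected-gradient sketch: n agents minimize
# sum_i f_i(x) = sum_i 0.5*||x - b_i||^2 subject to x in a box,
# mixing with neighbors via a doubly stochastic matrix W.
# Illustrative only; not the exact update of any cited paper.
n, lr, steps = 4, 0.1, 300
b = np.array([[2.0], [-1.0], [4.0], [3.0]])   # heterogeneous local targets
W = np.full((n, n), 1.0 / n)                  # complete-graph averaging matrix
lo, hi = 0.0, 2.5                             # box constraint [lo, hi]

x = np.zeros((n, 1))                          # one estimate per agent
for _ in range(steps):
    grad = x - b                              # local gradients
    x = W @ (x - lr * grad)                   # descend locally, then mix
    x = np.clip(x, lo, hi)                    # project onto the box

# All agents agree on the constrained optimum; here the unconstrained
# average of the b_i is 2.0, which lies inside the box.
print(x.ravel())
```

With a complete graph the agents reach exact consensus every round; on sparser graphs, or with subgradients of nonsmooth objectives, a decaying step size is typically needed for exact convergence.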
From: Hana, sent Sunday, March 12, 2023, 6:01 PM. Subject: Re: [hshokrig/EP3260-MLoNs-2023] new lecture and CAs uploaded (Issue #7)
Hi all!
We are group 3 (Jeannie, Li, Yifei and Yusen). We’d like to work on topic 1 as it is closely related to the research projects that some of us are working on.
Here is our ranking: topic 1, 3, 2.
Best regards, Li
Hi,
We are from group 2 (Zinat, Hansi, Eren, and Irshad), and our preference is as follows: Topic 2, Topic 4, and Topic 3.
Regards, Satya.
group 1, @ozanalptopal : topic 4
group 2, @lonaparte : topic 1
group 3, @GaneshSeeram : topic 2
Dear Professor Shokri-Ghadikolaei,
I am writing this email to make sure that our group has understood everything correctly regarding the presentations at the MLoN workshop.
We are assigned the fourth topic, which is related to error compensation and variance reduction in distributed optimization algorithms.
As you mentioned in your previous email, we are allowed to choose other related papers instead of the ones that you suggested.
We suggested three papers. The first paper is my own work, and I am very interested in presenting it at the workshop and getting some feedback.
The algorithm removes the variance caused by heterogeneous local objective functions and, for the first time, handles local constraints, so it is closely related to the topic.
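To make that claim concrete: one standard way to remove the bias and variance coming from heterogeneous local objectives is a gradient-tracking-style correction, where each agent tracks the network-average gradient with an auxiliary variable. The sketch below uses assumed notation and is not the actual algorithm of arXiv:2210.03232.

```python
import numpy as np

# Toy gradient-tracking sketch: each agent i keeps an auxiliary
# variable y_i that tracks the network-average gradient, removing the
# drift caused by heterogeneous local objectives f_i(x) = 0.5*||x - b_i||^2.
# Generic illustration only; not the algorithm of the referenced paper.
n, lr, steps = 4, 0.2, 200
b = np.array([[10.0], [-6.0], [3.0], [1.0]])  # very heterogeneous targets
W = np.full((n, n), 1.0 / n)                  # doubly stochastic mixing matrix

x = np.zeros((n, 1))
g_prev = x - b                                # local gradients at x_0
y = g_prev.copy()                             # tracker initialized to grad f_i
for _ in range(steps):
    x = W @ x - lr * y                        # descend along tracked gradient
    g = x - b                                 # new local gradients
    y = W @ y + g - g_prev                    # track the average gradient
    g_prev = g

# Every agent converges to the global minimizer mean(b) = 2.0,
# even though each local minimizer b_i is far from it.
print(x.ravel())
```

The invariant is that the average of the y_i always equals the average of the current local gradients, so each agent effectively descends along the global gradient rather than its own.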
Does the allocation of the fourth topic to our group mean that we can present this paper: https://arxiv.org/abs/2210.03232?
Best regards,
Group one
On Sun, Mar 12, 2023 at 9:33 PM, Ozan Alp Topal wrote:
You now have the last lecture slides and CAs 6 and 7, the final ones. The deadline for the CAs is April 15.