JasonDCox / ML-Mentorship-GovSchool


Ensemble Methods for Supervised Machine Learning #2

Closed: gavinjalberghini closed this issue 2 years ago

gavinjalberghini commented 2 years ago

Description: In common ML practice, it is often not enough to run a single classifier. Several research studies have shown that multiple classifiers working together can capture more complex relationships than any single model. For this task, I would like you both to read about ensemble learning. Also read about bagging and boosting and how they increase predictive performance. Think about how you could apply these practices to kNN: how would you combine predictions from multiple kNNs? You will likely be asked to do this in the coming weeks, so notes may be beneficial.
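
As a starting point for the kNN question, here is a minimal sketch of one way to bag kNN classifiers: train each model on a bootstrap sample of the training set and combine their predictions by majority vote. It assumes scikit-learn and NumPy are available; the `BaggedKNN` name and its parameters are illustrative, not part of the assigned reading.

```python
# Minimal sketch of bagging with kNN base learners, assuming scikit-learn and
# NumPy are available. The class name and parameters are illustrative only.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


class BaggedKNN:
    def __init__(self, n_estimators=10, k=5, random_state=0):
        self.n_estimators = n_estimators
        self.k = k
        self.rng = np.random.default_rng(random_state)
        self.models = []

    def fit(self, X, y):
        """X: (n_samples, n_features) array; y: non-negative integer labels."""
        X, y = np.asarray(X), np.asarray(y)
        n = len(X)
        self.models = []
        for _ in range(self.n_estimators):
            # Bagging = bootstrap aggregating: each kNN is trained on a
            # different bootstrap sample (n rows drawn with replacement).
            idx = self.rng.integers(0, n, size=n)
            model = KNeighborsClassifier(n_neighbors=self.k)
            model.fit(X[idx], y[idx])
            self.models.append(model)
        return self

    def predict(self, X):
        # Aggregate by majority vote across the ensemble's predictions.
        preds = np.stack([m.predict(X) for m in self.models])  # (n_estimators, n_samples)
        return np.array([np.bincount(col).argmax() for col in preds.T])
```

Usage would look like `BaggedKNN(n_estimators=10, k=5).fit(X_train, y_train).predict(X_test)`. Boosting combines models differently: learners are trained sequentially, each one focusing on the examples the previous ones misclassified, and their votes are weighted rather than counted equally.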

Acceptance Criteria: Read the supporting documentation linked below.

Articles:
- What is ensemble learning: http://www.scholarpedia.org/article/Ensemble_learning
- Why ensembles: https://towardsdatascience.com/ensemble-methods-in-machine-learning-what-are-they-and-why-use-them-68ec3f9fef5f
- What is bagging: https://www.ibm.com/cloud/learn/bagging
- What is boosting: https://towardsdatascience.com/what-is-boosting-in-machine-learning-2244aa196682

Research: https://sci-hubtw.hkvisa.net/10.1002/widm.1249

gavinjalberghini commented 2 years ago

V: 2

brandonC1234 commented 2 years ago

Complete - Brandon