👋 @willyngashu Good afternoon and thank you for submitting your topic suggestion. Your topic form has been entered into our queue and should be reviewed (for approval) as soon as a content moderator is finished reviewing the ones in the queue before it.
@lalith1403 Please correct me if I am mistaken here.
Sounds like a helpful topic - let's please be sure it adds value beyond what is in any official docs and/or what is covered on other blog sites. (The article should go beyond a basic explanation - and it is always best to reference any relevant EngEd articles and build upon them.)
Please be attentive to grammar/readability and make sure that you put your article through a thorough editing review prior to submitting it for final approval. (There are some great free tools that we reference in EngEd resources.) ANY ARTICLE SUBMITTED WITH GLARING ERRORS WILL BE IMMEDIATELY CLOSED. Please be sure to double-check that it does not overlap with any existing EngEd articles, articles on other blog sites, or any incoming EngEd topic suggestions (if you haven't already) to avoid any potential article closure. Please reference any relevant EngEd articles in yours. - Approved
@willyngashu
Proposal Submission
Proposed title of article
[Machine Learning] Implementing undersampling, oversampling and SMOTE techniques to handle imbalanced data in deep neural networks
Proposed article introduction
Imbalanced data refers to datasets where the target class has an uneven distribution of observations, i.e., one class label has a very high number of observations while the other has very few. Classes that make up a large proportion of the dataset are called majority classes, while those that make up a smaller proportion are minority classes. Training a model on an imbalanced dataset requires certain adjustments; otherwise, the model will not perform as expected.
Various techniques are used to handle imbalanced datasets, such as undersampling, oversampling, and SMOTE (Synthetic Minority Oversampling Technique).
In this tutorial, we will apply each of these methods practically while building a deep neural network. We will implement the deep learning model and add all the required layers.
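As a rough sketch of the resampling step, the snippet below applies the three techniques using the imbalanced-learn library. The synthetic dataset, class weights, and random seeds are placeholders for illustration only, not the article's final setup:

```python
# Minimal sketch of the three resampling techniques using imbalanced-learn.
# The synthetic dataset below is only a placeholder for the credit fraud data.
from sklearn.datasets import make_classification
from imblearn.under_sampling import RandomUnderSampler
from imblearn.over_sampling import RandomOverSampler, SMOTE

# Synthetic stand-in for an imbalanced dataset (~1% minority class).
X, y = make_classification(n_samples=10_000, n_features=20,
                           weights=[0.99, 0.01], random_state=42)

# Undersampling: discard majority-class samples until the classes balance.
X_under, y_under = RandomUnderSampler(random_state=42).fit_resample(X, y)

# Oversampling: duplicate minority-class samples until the classes balance.
X_over, y_over = RandomOverSampler(random_state=42).fit_resample(X, y)

# SMOTE: synthesize new minority-class samples by interpolating between neighbors.
X_smote, y_smote = SMOTE(random_state=42).fit_resample(X, y)
```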
Key takeaways
Article quality
This tutorial is unique because we will implement each of these methods while building the deep neural network. We will build a deep learning model for credit fraud detection. We will use metrics such as precision, recall, and F1-score to measure the performance of our model after balancing the classes. We will then compare the results with the original model trained on the imbalanced dataset to see whether the model has improved.
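A hedged sketch of the modeling and evaluation step, continuing from the placeholder data above: the layer sizes, epochs, and 0.5 decision threshold are assumptions for illustration, not the article's final architecture.

```python
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from imblearn.over_sampling import SMOTE
from tensorflow import keras

# Split first, then resample only the training data so the test set keeps
# the original (imbalanced) class distribution.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)
X_train_bal, y_train_bal = SMOTE(random_state=42).fit_resample(X_train, y_train)

# A small deep neural network for binary classification.
model = keras.Sequential([
    keras.Input(shape=(X_train_bal.shape[1],)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_train_bal, y_train_bal, epochs=10, batch_size=64, verbose=0)

# Precision, recall, and F1-score on the untouched test split; training the same
# model on the original X_train, y_train gives the baseline to compare against.
y_pred = (model.predict(X_test) > 0.5).astype(int).ravel()
print(classification_report(y_test, y_pred))
```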
References
Please list links to any published content/research that you intend to use to support/guide this article.
Conclusion
Finally, remove the Pre-Submission advice section and all our blockquoted notes as you fill in the form before you submit. We look forward to reviewing your topic suggestion.
Templates to use as guides