👋 @charlesndirutu33 Good afternoon and thank you for submitting your topic suggestion. Your topic form has been entered into our queue and should be reviewed (for approval) as soon as a content moderator has finished reviewing the ones ahead of it in the queue.
Sounds like a helpful topic - let's please be sure it adds value beyond what is in the official docs and/or what is covered on other blog sites. (The article should go beyond a basic explanation, and it is always best to reference any related EngEd articles and build upon them.) @charlesndirutu33
Please be attentive to grammar/readability and make sure that you put your article through a thorough editing review prior to submitting it for final approval. (There are some great free tools that we reference in EngEd resources.) ANY ARTICLE SUBMITTED WITH GLARING ERRORS WILL BE IMMEDIATELY CLOSED.
Please be sure to double-check that it does not overlap with any existing EngEd articles, articles on other blog sites, or any incoming EngEd topic suggestions (if you haven't already) to avoid potential article closure. Please reference any relevant EngEd articles in yours. - Approved
closed #7179 via #7318
Proposal Submission
Proposed title of article
[Machine Learning] Natural language processing using TensorFlow and the BERT model
Proposed article introduction
Natural language processing is the application of computational techniques to the analysis and synthesis of natural language and speech. Hugging Face provides thousands of pre-trained models to perform tasks on different modalities such as text, vision, and audio: text tasks like text classification, information extraction, question answering, summarization, translation, and text generation; image tasks like image classification, object detection, and segmentation; and audio tasks like speech recognition and audio classification.
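As a minimal sketch of how one of these text tasks can be run, the snippet below uses the Transformers pipeline API with its default sentiment-analysis model; the example sentence and the printed output are illustrative assumptions, since the exact model and score depend on the library version:

```python
# A minimal sketch: run one of the text tasks listed above through the
# Hugging Face pipeline API. The default model is downloaded on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Hugging Face makes NLP easy.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99}] - output varies by model
```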
Hugging Face Transformers supports pre-trained models such as ALBERT, BERT, BART, and BARThez. In this tutorial, we will focus on the BERT model. Bidirectional Encoder Representations from Transformers (BERT) is a transformer-based machine learning technique for natural language processing (NLP) pre-training developed by Google. BERT is designed to help computers understand the meaning of ambiguous language in text by using the surrounding text to establish context.
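For illustration, here is a minimal sketch of loading a pre-trained BERT model and its tokenizer from Hugging Face Transformers; the bert-base-uncased checkpoint is an assumption, and any BERT checkpoint would work:

```python
# A minimal sketch of loading pre-trained BERT with its tokenizer and
# inspecting the contextual embeddings it produces for a sentence.
from transformers import BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = TFBertModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and run it through BERT.
inputs = tokenizer("BERT uses surrounding text to establish context.", return_tensors="tf")
outputs = model(inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```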
We import BERT from the Hugging Face Transformers library and use it to build a custom sentiment analysis model. Sentiment analysis is an NLP task that classifies customer reviews. Each review falls into one of five classes:
0 - negative
1 - somewhat negative
2 - neutral
3 - somewhat positive
4 - positive
We will use TensorFlow to fine-tune the model; a minimal sketch of that setup is shown below.
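The sketch below is an illustration rather than the article's exact code: the bert-base-uncased checkpoint, the toy two-review dataset, and the 2e-5 learning rate are all assumptions.

```python
# A minimal sketch of fine-tuning BERT with TensorFlow for the five-class
# sentiment task described above.
import tensorflow as tf
from transformers import BertTokenizer, TFBertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# num_labels=5 matches the classes 0 (negative) through 4 (positive).
model = TFBertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=5)

# Toy stand-in data; a real dataset of labeled reviews would replace this.
reviews = ["The product was terrible.", "An absolute masterpiece!"]
labels = tf.constant([0, 4])

encodings = tokenizer(reviews, truncation=True, padding=True, return_tensors="tf")

# The model outputs raw logits, so the loss is computed with from_logits=True.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(dict(encodings), labels, epochs=1, batch_size=2)
```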
Key takeaways
Article quality
This article is unique because it explains the concept of transformers in detail. Using the article, the reader will learn about the different transformer models before focusing on BERT. It also covers the basic BERT architecture, which will show the reader how to fine-tune the model for sentiment analysis. The article will also cover the Hugging Face Transformers library, and we will discuss the different tasks it can solve. The tutorial uses detailed steps that a reader can easily follow.