
“Section's Engineering Education (EngEd) Program is dedicated to offering a unique quality community experience for computer science university students.”
Apache License 2.0

[Machine Learning] Implementing a BERT Model using Hugging Face Transformers #6426

Closed kelvinkimani501 closed 2 years ago

kelvinkimani501 commented 2 years ago

Proposal Submission

Proposed title of article

Implementing a BERT Model using Hugging Face Transformers

Proposed article introduction

Hugging Face Transformers provides thousands of pre-trained models for tasks across different modalities such as text, vision, and audio. For text, these include tasks like text classification, information extraction, question answering, summarization, translation, and text generation. For images, tasks like image classification, object detection, and segmentation. For audio, tasks like speech recognition and audio classification.

Hugging Face Transformers supports pre-trained models such as ALBERT, BERT, BART, BARThez, etc. In this tutorial, we will focus on the BERT model. Bidirectional Encoder Representations from Transformers (BERT) is a transformer-based machine learning technique for natural language processing (NLP) pre-training developed by Google. BERT is designed to help computers understand the meaning of ambiguous language in text by using the surrounding text to establish context.

We will import BERT from Hugging Face Transformers and use it to build a custom sentiment analysis model. Sentiment analysis is an NLP task that classifies customer reviews as either positive or negative.
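As a rough sketch of what the finished tutorial would produce (the `pipeline` shortcut shown here is only illustrative; the article itself would fine-tune BERT for the task, and `positive_probability` / `classify_review` are hypothetical helper names added for this example):

```python
import math

def positive_probability(logits):
    """Softmax over a [negative, positive] logit pair; returns P(positive).

    BERT-based sentiment classifiers emit one logit per class; the softmax
    turns them into the probability reported to the user.
    """
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    return exps[1] / sum(exps)

def classify_review(text):
    # Requires `pip install transformers torch` and downloads model weights
    # on first use, so it is kept out of the import-time path.
    from transformers import pipeline
    classifier = pipeline("sentiment-analysis")  # a BERT-family model by default
    return classifier(text)

if __name__ == "__main__":
    # Pure-Python demo; uncomment the next line to run the real model:
    # print(classify_review("The delivery was fast and the product works great!"))
    print(positive_probability([-2.0, 2.0]))
```

With the `transformers` and `torch` packages installed, `classify_review("Great product!")` returns a list of label/score dictionaries from the pipeline.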

Key takeaways

  1. What is the Hugging Face library?
  2. Installing the Hugging Face Transformers library.
  3. Exploring pre-trained models such as ALBERT, BERT, BART, and BARThez.
  4. Importing BERT.
  5. Building the model using BERT.
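The installation and import steps listed above can be sketched as follows (the `manual_attention_mask` helper is a hypothetical name added only to illustrate what the tokenizer produces; loading the real model requires `pip install transformers torch`):

```python
def manual_attention_mask(token_ids, pad_id=0):
    """BERT attention mask: 1 for real tokens, 0 for [PAD] (id 0 in bert-base).

    The tokenizer builds this automatically; shown here to make the
    input format concrete.
    """
    return [1 if t != pad_id else 0 for t in token_ids]

def load_bert(name="bert-base-uncased"):
    # Requires `pip install transformers torch`; downloads weights on first use.
    from transformers import BertTokenizer, BertModel
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    return tokenizer, model

if __name__ == "__main__":
    # [CLS]=101 and [SEP]=102 in bert-base-uncased; 0 is [PAD].
    print(manual_attention_mask([101, 7592, 102, 0, 0]))  # [1, 1, 1, 0, 0]
```

Once loaded, `tokenizer(text, return_tensors="pt")` produces the `input_ids` and `attention_mask` tensors that `model(...)` consumes.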

Article quality

This article is unique because it explains the concept of transformers in detail. The reader will learn about the different transformer models before focusing on BERT. It also covers the basic BERT architecture, which will help the reader understand how to fine-tune the model for sentiment analysis. The article will also cover Hugging Face Transformers and discuss the different tasks the library can solve. The tutorial uses detailed steps that a reader can easily follow.

References

Please list links to any published content/research that you intend to use to support/guide this article.

Conclusion

Finally, remove the Pre-Submission advice section and all our blockquoted notes as you fill in the form before you submit. We look forward to reviewing your topic suggestion.

Templates to use as guides

lalith1403 commented 2 years ago

Good afternoon, and thank you for submitting your topic to the EngEd program. After some careful consideration, it struck us that this topic may be a bit over-saturated across other blog sites and official documentation.

We typically refrain from publishing content that is covered widely on the net or other blogs. We're more interested in original, practitioner-focused content that takes a deeper dive into programming-centric concepts.

However, in order to approve the topic, it has to provide value to the larger developer community. A great way to turn this into an in-depth article that adds that value would be to walk the reader through the use of these methods and functions by building a unique, useful project.

We would love to learn your thought process behind the solution you arrived at using these concepts and topics.

That way, a developer could see them in action. As mentioned above, we believe this topic is widely covered on other blog sites.

The best way for students to build a great portfolio is by building what does not exist and what can provide the most value.