
https://denis2054.github.io/Transformers-for-NLP-2nd-Edition/
MIT License

Transformers-for-NLP-2nd-Edition


©Copyright 2022-2024, Denis Rothman, Packt Publishing

Last updated: January 4, 2024

Dolphin 🐬 Additional bonus programs for OpenAI ChatGPT (GPT-3.5 legacy), ChatGPT Plus (GPT-3.5 default), and GPT-4.
API examples for GPT-3.5-turbo, GPT-4, DALL-E 2, Google Cloud AI Language, and Google Cloud AI Vision.
Discover HuggingGPT, Google Smart Compose, Google Bard, and Microsoft's New Bing.
Advanced prompt engineering with the ChatGPT API and the GPT-4 API.

Just look for the Dolphin 🐬 and enjoy your ride into the future of AI!

Contact me on LinkedIn
Get the book on Amazon

Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning, training, and prompt engineering examples. A bonus section with ChatGPT, GPT-3.5-turbo, GPT-4, and DALL-E, including jump-starting GPT-4, speech-to-text, text-to-speech, text-to-image generation with DALL-E and more.

Getting started

You can run these notebooks on cloud platforms like Google Colab or your local machine. Note that some chapters require a GPU to run in a reasonable amount of time, so we recommend one of the cloud platforms as they come pre-installed with CUDA.
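For example, a quick way to confirm that a GPU is available before running the heavier chapters is a minimal check such as the following sketch, which assumes a PyTorch runtime like the one Colab provides:

```python
import torch

# Verify that a CUDA-capable GPU is visible to PyTorch
# (on Colab: Runtime > Change runtime type > GPU).
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
else:
    print("No GPU detected; GPU-heavy chapters will be slow on CPU.")
```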

Getting started with the OpenAI API

December 6, 2023: OpenAI is currently updating its platform. If you encounter issues with the notebooks in this repository, refer to the following updated notebooks, which show the fixes and can be applied to other notebooks as needed: Getting_Started_GPT_3.ipynb, Summarizing_with_ChatGPT.ipynb, and Semantic_Role_Labeling_with_ChatGPT.ipynb.
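As a rough illustration of the kind of change involved (a sketch, not code taken from the notebooks themselves), the openai Python package moved from module-level calls to a client object in version 1.x; the following assumes openai>=1.0 and an OPENAI_API_KEY environment variable:

```python
from openai import OpenAI

# The client object replaces module-level calls such as openai.ChatCompletion.create.
client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the Transformer architecture in one sentence."},
    ],
)
print(response.choices[0].message.content)
```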

Running on a cloud platform or in your environment

To run these notebooks on a cloud platform, just click on one of the badges in the table below, or run them in your own environment.

| Chapter | Colab | Kaggle | Gradient | SageMaker Studio Lab |
| --- | --- | --- | --- | --- |
| Chapter 2: Getting Started with the Architecture of the Transformer Model | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 3: Fine-Tuning BERT Models | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 4: Pretraining a RoBERTa Model from Scratch | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 5: Downstream NLP Tasks with Transformers | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 6: Machine Translation with the Transformer | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 7: The Rise of Suprahuman Transformers with GPT-3 Engines | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 8: Applying Transformers to Legal and Financial Documents for AI Text Summarization | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 9: Matching Tokenizers and Datasets | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 10: Semantic Role Labeling | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 11: Let Your Data Do the Talking: Story, Questions, and Answers | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 12: Detecting Customer Emotions to Make Predictions | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 13: Analyzing Fake News with Transformers | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 14: Interpreting Black Box Transformer Models | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 15: From NLP to Task-Agnostic Transformer Models | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 16: The Emergence of Transformer-Driven Copilots | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Chapter 17: 🐬 Consolidation of Suprahuman Transformers with OpenAI ChatGPT and GPT-4 | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Appendix III: Generic Text Completion with GPT-2 | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| Appendix IV: Custom Text Completion with GPT-2 | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |

Additional OpenAI Bonus Notebooks

| Bonus | Notebook | Colab | Kaggle | Gradient | SageMaker Studio Lab |
| --- | --- | --- | --- | --- | --- |
| 🐬 Explore and compare ChatGPT, GPT-4, and GPT-3 models | Exploring_GPT_4_API | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| 🐬 Create a ChatGPT XAI function that explains ChatGPT and an XAI SHAP function | XAI_by_ChatGPT_for_ChatGPT | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| 🐬 Go back to the origins with GPT-2 and ChatGPT | GPT_2_and_ChatGPT_the_Origins | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| 🐬 ChatGPT or davinci_instruct? Which is best for your project? | ChatGPT_as_a_Cobot_ChatGPT_versus_davinci_instruct.ipynb | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |
| 🐬 AI Language Model Comparison: explore various AI language models and their capabilities; dive into different APIs and functionalities such as sentiment analysis, entity recognition, syntax analysis, content classification, and AI vision; discover and compare the offerings of Google Cloud AI Language, Google Cloud AI Vision, OpenAI GPT-4, Google Bard, Microsoft New Bing, ChatGPT Plus-GPT-4, Hugging Face, HuggingGPT, and Google Smart Compose | Exploring_and_Comparing_Advanced_AI_Technologies.ipynb | Open In Colab | Kaggle | Gradient | Open In SageMaker Studio Lab |

December 6, 2023 update: in newer versions of Gradio, the way inputs and outputs are defined has changed. Instead of using gr.inputs.Textbox, use gr.Textbox directly for both inputs and outputs.
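As a brief illustration of that Gradio change (a sketch, not code from the notebook itself, with a placeholder function standing in for the real model call), the input and output components move from the gr.inputs / gr.outputs namespaces to top-level classes:

```python
import gradio as gr

def analyze(text):
    # Placeholder function for illustration; the notebook wires in real model calls.
    return text.upper()

# Older Gradio versions:
# demo = gr.Interface(fn=analyze, inputs=gr.inputs.Textbox(lines=2), outputs=gr.outputs.Textbox())

# Newer Gradio versions use the top-level components directly:
demo = gr.Interface(fn=analyze, inputs=gr.Textbox(lines=2), outputs=gr.Textbox())
demo.launch()
```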

Key Features

Implement models, such as BERT, Reformer, and T5, that outperform classical language models
Compare NLP applications using GPT-3, GPT-2, and other transformers
Analyze advanced use cases, including polysemy, cross-lingual learning, and computer vision
Explore a GitHub BONUS directory with state-of-the-art ChatGPT, GPT-3.5-turbo, GPT-4, and DALL-E notebooks

Book Description

Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence.

Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translations, language modeling, question-answering, and many more NLP domains with transformers.

An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is not enough anymore. Different platforms have different benefits depending on the application, whether it's cost, flexibility, ease of implementation, results, or performance. In this book, we analyze numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP.

This book takes transformers' capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. Also, see how transformers can create code using just a brief description.

By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets.

What you will learn

Discover new ways of performing NLP techniques with the latest pretrained transformers
Grasp the workings of the original Transformer, GPT-3, BERT, T5, DeBERTa, and Reformer
Create language understanding Python programs using concepts that outperform classical deep learning models
Apply Python, TensorFlow, and PyTorch programs to sentiment analysis, text summarization, speech recognition, machine translations, and more
Measure the productivity of key transformers to define their scope, potential, and limits in production

Who This Book Is For

If you want to learn about and apply transformers to your natural language (and image) data, this book is for you.

A good understanding of NLP, Python, and deep learning is required to benefit most from this book. Many platforms covered in this book provide interactive user interfaces, which allow readers with a general interest in NLP and AI to follow several chapters of this book.

Table of Contents

1. What are Transformers?
2. Getting Started with the Architecture of the Transformer Model
3. Fine-Tuning BERT Models
4. Pretraining a RoBERTa Model from Scratch
5. Downstream NLP Tasks with Transformers
6. Machine Translation with the Transformer
7. The Rise of Suprahuman Transformers with GPT-3 Engines
8. Applying Transformers to Legal and Financial Documents for AI Text Summarization
9. Matching Tokenizers and Datasets
10. Semantic Role Labeling with BERT-Based Transformers
11. Let Your Data Do the Talking: Story, Questions, and Answers
12. Detecting Customer Emotions to Make Predictions
13. Analyzing Fake News with Transformers
14. Interpreting Black Box Transformer Models
15. From NLP to Task-Agnostic Transformer Models
16. The Emergence of Transformer-Driven Copilots
17. The Consolidation of Suprahuman Transformers with OpenAI's ChatGPT and GPT-4

Appendix I: Terminology of Transformer Models
Appendix II: Hardware Constraints for Transformer Models
And more!