ngoquanghuy99 / transformer-summarization

An abstractive text summarization model based on the Transformer decoder (GPT-2), implemented with Google's Trax.

Abstractive summarization using Generative Pre-trained Transformer (GPT-2)

This is my Trax implementation of GPT-2 (a decoder-only Transformer) for one of the natural language generation tasks: abstractive summarization.
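Since GPT-2 is a decoder-only language model, summarization is typically framed as plain language modeling over the article and summary joined by a separator token, with the loss applied only to the summary tokens. A minimal sketch of that framing (the token IDs and the `SEP` id are illustrative, not from this repo):

```python
# Sketch: framing abstractive summarization as decoder-only language modeling.
# Token IDs and the SEP id below are hypothetical, for illustration only.

SEP = 0  # hypothetical separator token id

def make_lm_example(article_ids, summary_ids):
    """Concatenate article and summary into one token sequence, plus a
    loss mask that is 1 only over the summary tokens."""
    tokens = article_ids + [SEP] + summary_ids
    mask = [0] * (len(article_ids) + 1) + [1] * len(summary_ids)
    return tokens, mask

tokens, mask = make_lm_example([5, 6, 7], [8, 9])
print(tokens)  # [5, 6, 7, 0, 8, 9]
print(mask)    # [0, 0, 0, 0, 1, 1]
```

At inference time, the model is fed the article followed by `SEP` and decodes the summary autoregressively.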

Paper: Language Models are Unsupervised Multitask Learners.

Library: Trax, a deep learning library built on JAX, actively used and maintained by the Google Brain team.

Dataset: https://www.kaggle.com/shashichander009/inshorts-news-data
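The Inshorts data comes as a CSV of short news texts paired with headlines, which can serve as (article, summary) pairs. A sketch of loading it with the standard library; the column names `"Short"` and `"Headline"` are assumptions about the Kaggle file's header row, so adjust them to the actual columns:

```python
import csv
import io

# Sketch: turning the Inshorts CSV into (article, summary) pairs.
# The column names "Short" and "Headline" are assumed, not verified.
sample_csv = """Headline,Short
Markets rally,Stocks rose sharply today after the central bank held rates steady.
"""

def load_pairs(fileobj):
    reader = csv.DictReader(fileobj)
    # Treat the short news text as the source and the headline as the summary.
    return [(row["Short"], row["Headline"]) for row in reader]

pairs = load_pairs(io.StringIO(sample_csv))
print(pairs[0][1])  # Markets rally
```

In practice `load_pairs` would be called on the downloaded CSV file (`open("...csv")`) rather than the in-memory sample used here.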