atharvapurdue / text_summarization

Researching and building text summarization AI models
Apache License 2.0

Model - seq2seq #4

Open marktrovinger opened 1 year ago

marktrovinger commented 1 year ago

As one of the models we are evaluating for our project, seq2seq needs an issue to work against. We will still need to develop a workflow, but for the moment, in checklist order:

marktrovinger commented 1 year ago

So, Hugging Face has a great series on how to use a pre-trained model for different NLP tasks: https://huggingface.co/course/chapter3/1?fw=pt

A little farther down, they talk about using pre-trained models for summarization: https://huggingface.co/course/chapter7/5?fw=pt

It looks like we can use T5 as our pre-trained base for Seq2Seq, and BART/BERT when we look at those models.
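As a concrete starting point, here is a minimal sketch of zero-shot summarization with a pre-trained T5 checkpoint via the `transformers` summarization pipeline. The `t5-small` checkpoint and the generation lengths are assumptions for illustration, not decisions from this thread:

```python
def build_t5_input(text: str) -> str:
    # T5 is a text-to-text model: the task is signalled by a text prefix,
    # so summarization inputs are framed as "summarize: <document>".
    # (The transformers summarization pipeline adds this prefix itself for
    # T5 checkpoints; this helper just makes the framing explicit.)
    return "summarize: " + text

def summarize(text: str) -> str:
    # Import deferred so the prefix helper above works even without
    # transformers installed.
    from transformers import pipeline

    # "t5-small" is an assumed checkpoint for illustration; any T5 variant
    # (or later BART) could be swapped in. Length limits are arbitrary.
    summarizer = pipeline("summarization", model="t5-small")
    result = summarizer(text, max_length=50, min_length=10)
    return result[0]["summary_text"]

if __name__ == "__main__":
    article = "The quick brown fox jumped over the lazy dog. " * 10
    print(summarize(article))
```

The linked chapter 7 of the course goes further, fine-tuning such a checkpoint on a summarization dataset rather than using it zero-shot.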