Open: marktrovinger opened this issue 1 year ago
So, HuggingFace has a great series about how to use a pre-trained model for different NLP tasks: https://huggingface.co/course/chapter3/1?fw=pt
A little farther down, they talk about using pre-trained models to do summarization: https://huggingface.co/course/chapter7/5?fw=pt
It looks like we can use T5 as our pre-trained base for seq2seq, and BART/BERT later when we get to those models.
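As a starting point, summarization with a pre-trained T5 checkpoint can be sketched with the transformers pipeline API from the course chapters linked above. This is a minimal sketch, not our final workflow; the "t5-small" checkpoint name is an assumption, and we'd swap in whatever base model we settle on:

```python
# Minimal sketch: summarization with a pre-trained T5 checkpoint,
# no fine-tuning. Assumes the `transformers` package is installed.
from transformers import pipeline


def build_summarizer(model_name: str = "t5-small"):
    # "t5-small" is an assumed placeholder checkpoint; any T5/BART
    # seq2seq checkpoint on the Hub should work here. For T5, the
    # pipeline applies the "summarize: " task prefix automatically
    # from the model's task-specific config.
    return pipeline("summarization", model=model_name)


if __name__ == "__main__":
    summarizer = build_summarizer()
    article = (
        "Transformers are a family of neural network architectures "
        "built on self-attention. Pre-trained transformer models can "
        "be adapted to many NLP tasks, including summarization, with "
        "little or no task-specific training."
    )
    result = summarizer(article, max_length=40, min_length=10)
    print(result[0]["summary_text"])
```

Swapping `model_name` is all it should take to compare T5 against BART later, since both expose the same seq2seq pipeline interface.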
Since seq2seq is one of the models we are looking at for our project, it needs an issue to work against. We still need to develop a full workflow, but for the moment, in checklist order: