RoBERTa: A Robustly Optimized BERT Pretraining Approach #7

eagle705 commented 2 years ago

Author

Takeaways

Abstract

Introduction

Background

Training Objectives
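
RoBERTa keeps BERT's masked-language-modeling objective (mask 15% of tokens; of those, 80% become `[MASK]`, 10% a random token, 10% are left unchanged) but drops NSP and applies the masking dynamically when each batch is built, rather than once during preprocessing. A minimal sketch of that masking step, assuming plain Python lists of token ids and a caller-supplied `mask_token_id` (all names here are hypothetical, not the authors' code):

```python
import random

def dynamic_mask(token_ids, vocab_size, mask_token_id, mlm_prob=0.15):
    """BERT-style MLM masking applied to one sequence. Calling this every
    time a batch is built (instead of once at preprocessing) gives
    RoBERTa's dynamic masking: each epoch sees a different mask pattern."""
    input_ids = list(token_ids)
    labels = [-100] * len(token_ids)          # -100 = position ignored by the loss
    for i, tok in enumerate(token_ids):
        if random.random() < mlm_prob:        # select ~15% of positions
            labels[i] = tok                   # the model must recover the original token
            r = random.random()
            if r < 0.8:
                input_ids[i] = mask_token_id                 # 80%: replace with [MASK]
            elif r < 0.9:
                input_ids[i] = random.randrange(vocab_size)  # 10%: random token
            # remaining 10%: leave the token unchanged
    return input_ids, labels
```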

Experimental Setup

Implementation

Data

Evaluation

Training Procedure Analysis

Model Input Format and Next Sentence Prediction
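
The paper compares SEGMENT-PAIR and SENTENCE-PAIR inputs with NSP against FULL-SENTENCES and DOC-SENTENCES without NSP, and finds that dropping NSP and packing full sentences works as well or better. A rough sketch of the FULL-SENTENCES packing idea (my own illustration, not the released preprocessing code), assuming documents are already split into tokenized sentences and that a separator token id is supplied:

```python
def pack_full_sentences(documents, sep_token_id, max_len=512):
    """Greedily pack contiguous sentences into inputs of at most max_len
    tokens (FULL-SENTENCES format): inputs may cross document boundaries,
    with an extra separator token between documents, and no NSP label.
    Assumes every individual sentence is shorter than max_len."""
    sequences, buffer = [], []
    for doc in documents:            # doc: list of sentences, each a list of token ids
        for sent in doc:
            if len(buffer) + len(sent) > max_len and buffer:
                sequences.append(buffer)
                buffer = []
            buffer.extend(sent)
        if len(buffer) < max_len:
            buffer.append(sep_token_id)   # mark the document boundary
    if buffer:
        sequences.append(buffer)
    return sequences
```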

Results

Training with large batches
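
RoBERTa trains with much larger mini-batches than BERT (up to 8K sequences vs. 256). When a batch that large does not fit in memory, the usual trick is gradient accumulation; a PyTorch-style sketch, assuming a HuggingFace-style model whose forward pass returns an object with a `.loss` field (`model`, `optimizer` and `loader` are placeholders):

```python
def train_epoch(model, optimizer, loader, accum_steps=256):
    """Emulate a large batch by accumulating gradients over `accum_steps`
    micro-batches before each optimizer step, e.g. 32 sequences per
    micro-batch * 256 accumulation steps ~ 8K effective batch size."""
    model.train()
    optimizer.zero_grad()
    for step, batch in enumerate(loader):
        loss = model(**batch).loss / accum_steps  # scale so summed grads match the big-batch mean
        loss.backward()                           # gradients accumulate across micro-batches
        if (step + 1) % accum_steps == 0:
            optimizer.step()
            optimizer.zero_grad()
```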

Text Encoding
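
RoBERTa replaces BERT's 30K character-level BPE vocabulary with the ~50K byte-level BPE used by GPT-2, so no extra preprocessing or unknown tokens are needed. A quick way to poke at that tokenizer, assuming the `transformers` package is installed and the public `roberta-base` checkpoint can be downloaded:

```python
from transformers import RobertaTokenizer

# roberta-base ships the byte-level BPE vocabulary (~50K entries) from GPT-2.
tok = RobertaTokenizer.from_pretrained("roberta-base")
print(tok.vocab_size)                                  # ~50K
print(tok.tokenize("RoBERTa uses byte-level BPE."))    # pieces starting a new word carry the 'Ġ' space marker
```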

RoBERTa

Results

GLUE Results

SQuAD Results

RACE Results

Conclusion

Params
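
For reference, the released checkpoints come to roughly 125M parameters for RoBERTa-base and 355M for RoBERTa-large. A quick way to check the counts yourself, assuming `transformers` is installed and the checkpoints can be downloaded:

```python
from transformers import RobertaModel

for name in ("roberta-base", "roberta-large"):
    model = RobertaModel.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.0f}M parameters")   # roughly 125M / 355M
```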
