The paper proposes a new data augmentation framework.
Information
1 New things learned:
2 What I learned from the Related Work
2.1 Data augmentation
back-translation (Sennrich et al., 2016; Edunov et al., 2018; Xie et al., 2019), c-BERT word replacement (Wu et al., 2019), mixup (Guo et al., 2019; Chen et al., 2020a), Cutoff (Shen et al., 2020), adversarial training. How to integrate these techniques together has rarely been studied.
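Of the listed techniques, mixup is the easiest to sketch: it linearly interpolates both the inputs (here, toy sentence embeddings) and their one-hot labels. A minimal illustration in NumPy; the Beta-distribution parameter `alpha=0.4` and the helper name `mixup` are my own assumptions, not taken from the paper:

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.4, rng=None):
    """Interpolate two examples and their one-hot labels.

    alpha parameterizes the Beta distribution that samples the
    mixing coefficient lambda; 0.4 is a common choice, not a
    value taken from the paper.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    x = lam * x1 + (1 - lam) * x2
    y = lam * y1 + (1 - lam) * y2
    return x, y

# Two toy "sentence embeddings" with one-hot labels.
x1, y1 = np.array([1.0, 0.0]), np.array([1.0, 0.0])
x2, y2 = np.array([0.0, 1.0]), np.array([0.0, 1.0])
x, y = mixup(x1, y1, x2, y2, rng=np.random.default_rng(0))
```

The interpolated label `y` is a soft distribution over the two classes, which is why mixup is typically paired with a cross-entropy loss over soft targets.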
2.2 Contrastive learning
Some work has applied contrastive learning methods from computer vision (CV) to NLP.
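As a reminder of the core idea these works borrow, an InfoNCE-style contrastive objective pulls each example toward its augmented view and pushes it away from the other examples in the batch. A minimal NumPy sketch; the temperature `tau=0.1` and the function name are my own assumptions:

```python
import numpy as np

def info_nce_loss(z1, z2, tau=0.1):
    """InfoNCE-style contrastive loss for a batch.

    z1[i] and z2[i] are two augmented views of example i;
    the z2[j] with j != i serve as in-batch negatives.
    tau is the softmax temperature (0.1 is an assumed value).
    """
    # L2-normalize so dot products are cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # The positive pair for row i sits on the diagonal.
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
z = rng.normal(size=(4, 8))
# Identical views: every positive has maximal similarity, so the loss is small.
loss_aligned = info_nce_loss(z, z)
# Unrelated views: positives are no better than negatives, so the loss is larger.
loss_random = info_nce_loss(z, rng.normal(size=(4, 8)))
```

The paper's contribution, as I understand it, is in how the augmented views are produced, not in this loss itself.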
3 Tasks used for experimental validation (briefly described if unfamiliar)
Text classification tasks from the GLUE benchmark
4 Other tasks that could be tried, to my knowledge
Data augmentation can be extended to any task.
5 Good words, sentences, or paragraphs
However, this is especially challenging for natural language, given that the semantics of a sentence can be entirely altered by slight perturbations.