SimCLR: the standard contrastive learning model and the basis for later improvements
Information
1 New things learned:
Contrastive learning: the BERT of computer vision (self-supervised learning). It needs no labeled data; similar samples (positives) and dissimilar samples (negatives) are constructed from the data itself (see the sketch below).
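To make the positive/negative construction concrete, here is a minimal sketch of the SimCLR-style NT-Xent loss with in-batch negatives. The tensor shapes, batch size, and temperature are illustrative assumptions, not the paper's exact setup.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent loss over one batch.

    z1, z2: [N, D] projections of two augmented views of the same N images.
    Each (z1[i], z2[i]) is a positive pair; the other 2N - 2 views in the
    batch act as negatives.
    """
    N = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # [2N, D], unit-norm
    sim = torch.mm(z, z.t()) / temperature                # pairwise cosine similarities
    sim.fill_diagonal_(float("-inf"))                     # drop self-similarity

    # For view i, the positive is the other augmented view of the same image.
    pos_index = torch.cat([torch.arange(N, 2 * N), torch.arange(0, N)])
    return F.cross_entropy(sim, pos_index)

# Stand-ins for the projection-head outputs of two augmentations of one batch.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
loss = nt_xent_loss(z1, z2)
```

In practice z1 and z2 come from two random augmentations of the same images, passed through the encoder and projection head.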
2 Knowledge gained from the Related Work
2.1 Two paradigms of contrastive learning:
2.2 How negatives are constructed for representation learning
3 Tasks used for experimental validation (briefly described if unfamiliar)
The best result obtained with our ResNet-50 (4×) can match the supervised pretrained ResNet-50.
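For context, this comparison is typically made under the linear evaluation protocol: the pretrained encoder is frozen and only a linear classifier is trained on top of its features. A rough sketch, assuming a generic encoder, data loader, and hyperparameters chosen for illustration:

```python
import torch
import torch.nn as nn

def linear_eval(encoder, loader, feat_dim=2048, num_classes=1000, epochs=1):
    """Train only a linear classifier on top of a frozen, pretrained encoder."""
    encoder.eval()
    for p in encoder.parameters():
        p.requires_grad_(False)                 # keep the backbone frozen

    clf = nn.Linear(feat_dim, num_classes)      # the only trainable part
    opt = torch.optim.SGD(clf.parameters(), lr=0.1, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()

    for _ in range(epochs):
        for images, labels in loader:
            with torch.no_grad():
                feats = encoder(images)         # frozen representations
            loss = loss_fn(clf(feats), labels)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return clf
```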
4 Other tasks worth trying, within my knowledge
Sentence embeddings
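One way to carry the same idea over to sentence embeddings, roughly in the spirit of SimCSE: encode each sentence twice with dropout active, treat the two encodings as a positive pair, and use the other sentences in the batch as negatives. The model name, pooling choice, and example sentences are illustrative assumptions; the loss reuses nt_xent_loss from the sketch in section 1.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.train()                                  # keep dropout on for both passes

sentences = ["a photo of a dog", "a note about contrastive learning"]
batch = tok(sentences, padding=True, return_tensors="pt")

# Two forward passes give two dropout-perturbed views of each sentence.
z1 = model(**batch).last_hidden_state[:, 0]    # [CLS] pooling, pass 1
z2 = model(**batch).last_hidden_state[:, 0]    # [CLS] pooling, pass 2

loss = nt_xent_loss(z1, z2)                    # in-batch negatives, as above
```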
5 Good words, sentences, or paragraphs
The recent renaissance of self-supervised learning began with artificially designed pretext (pre-training) tasks.