insight papers #26
zc12345 commented 1 year ago
Is Pre-training Truly Better Than Meta-Learning?
arxiv
institution: Stanford University
TL;DR
- When a dataset's formal diversity is low, PT outperforms MAML on average.
- When formal diversity is high, MAML outperforms PT on average.
- The gaps behind both conclusions are small under the stricter statistical measure the paper uses, the effect size (Cohen's d; see the sketch below). Overall, though, the results overturn the earlier conclusion that PT is simply better than MAML: the dataset is the crucial factor.
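For context on the effect-size claim: Cohen's d divides the difference of means by the pooled standard deviation, and |d| < 0.2 is conventionally read as a negligible effect. A minimal NumPy sketch, with made-up accuracy numbers purely for illustration:

```python
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Effect size between two samples (pooled-SD variant of Cohen's d)."""
    na, nb = len(a), len(b)
    # Pooled variance with Bessel's correction.
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical few-shot accuracies for PT and MAML on one benchmark.
pt_acc = np.array([0.62, 0.64, 0.63, 0.61, 0.65])
maml_acc = np.array([0.61, 0.63, 0.62, 0.60, 0.64])
print(f"Cohen's d = {cohens_d(pt_acc, maml_acc):.3f}")
```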
utils
The paper it is positioned against is the 2020 paper
Rethinking Few-Shot Image Classification: a Good Embedding Is All You Need?
The authors note that even though MAML performs better on high-diversity datasets, PT is easier to train (the sketch below contrasts the two training loops).
The comparison in this paper is carried out under controlled conditions: the same architecture, the same optimizer, and all models trained to convergence.
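To make the "easier to train" point concrete, here is a minimal PyTorch sketch of the two regimes. This is not the paper's code; the model, shapes, and task tuples are hypothetical, and the MAML loop is the standard second-order formulation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical tiny model shared by both regimes.
model = nn.Linear(16, 5)

def pt_step(x, y, opt):
    # Pre-training (PT): one plain supervised step over pooled data.
    loss = F.cross_entropy(model(x), y)
    opt.zero_grad(); loss.backward(); opt.step()

def maml_step(tasks, meta_opt, inner_lr=0.01):
    # MAML: one differentiable inner adaptation step per task; the outer
    # loss backpropagates through that step (second-order gradients).
    meta_loss = 0.0
    for x_s, y_s, x_q, y_q in tasks:  # support / query split per task
        params = dict(model.named_parameters())
        inner_loss = F.cross_entropy(
            torch.func.functional_call(model, params, (x_s,)), y_s)
        grads = torch.autograd.grad(inner_loss, params.values(), create_graph=True)
        adapted = {k: p - inner_lr * g
                   for (k, p), g in zip(params.items(), grads)}
        meta_loss = meta_loss + F.cross_entropy(
            torch.func.functional_call(model, adapted, (x_q,)), y_q)
    meta_opt.zero_grad(); meta_loss.backward(); meta_opt.step()
```

pt_step is a single forward/backward pass, while maml_step nests a second differentiation through the inner update, which is consistent with the claim that PT is simpler to optimize even when MAML wins on high-diversity data.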