reproducibility-challenge / iclr_2019

ICLR Reproducibility Challenge 2019
https://reproducibility-challenge.github.io/iclr_2019/

Submission for Issue #79 (#150)

Closed. ArnoutDevos closed this issue 5 years ago.

ArnoutDevos commented 5 years ago

#79

ArnoutDevos commented 5 years ago

@reproducibility-org complete

reproducibility-org commented 5 years ago

Hi, please find below a review submitted by one of the reviewers:

Score: 8 Reviewer 1 comment : This work tries to reproduce meta-learning with MAML and differentiable closed-form solvers. The reproduced results match those of the original paper, and the hyperparameters are well searched. However, the writing of this report could be improved and more ablation studies could be included. Confidence : 3
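
For context, the "differentiable closed-form solver" at the core of the reproduced method is ridge regression solved in closed form on each episode's support set. The sketch below is an illustration only, not code from the report or from Bertinetto et al. (2019), and it omits details such as the adjustable regularization and output scaling used in the original method:

```python
import torch

def ridge_regression_head(X, Y, lam=1.0):
    """Closed-form ridge regression on the support set of one episode.

    X:   (n_support, d) features from a shared embedding network.
    Y:   (n_support, n_way) one-hot support labels.
    lam: ridge regularization strength.
    Returns W: (d, n_way) linear classifier weights.
    """
    n = X.shape[0]
    # Woodbury form: invert an (n_support x n_support) matrix instead of
    # a (d x d) one, which is cheap in the few-shot regime where n << d.
    A = X @ X.t() + lam * torch.eye(n, device=X.device, dtype=X.dtype)
    W = X.t() @ torch.linalg.solve(A, Y)
    return W

# Query-set logits; the solve above is differentiable, so the meta-loss
# on the query set backpropagates through it into the embedding network.
# logits = X_query @ W
```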

reproducibility-org commented 5 years ago

Hi, please find below a review submitted by one of the reviewers:

Score: 8 Reviewer 2 comment : This report makes an effort to reproduce the main results in the paper by Bertinetto et al. (2019).

In the absence of an open-source code release, the authors implemented code to reproduce its results, starting from the original paper and a previous implementation by Finn (2018), and shared their code publicly on GitHub. The submission also specifies which libraries and versions were used in their code, as well as the type of hardware on which the experiments were run.

This reproducibility work clearly states which experiments and scenarios the authors aimed to reproduce, what procedure they followed to implement the necessary code, and what assumptions and decisions they had to make to work around the lack of implementation details in the original manuscript, thus directly pointing at a shortcoming of the paper under analysis. The availability of the hyperparameter values chosen by the authors of this work provides a more solid baseline for future reproducibility efforts in this domain and offers a concrete suggestion to Bertinetto et al. on how to improve the impact and extensibility of their work. In fact, the authors of this reproducibility analysis shared their findings with the original authors on OpenReview, and the original authors were able to improve their contribution by addressing some of the issues raised in this report.

Paragraph 4 includes a thoughtful discussion of how the choice to vary the number of classes at training time affects reproducibility and fairness of comparison with prior literature, which points to the care and attention to detail the authors brought to this work.
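
To make this point concrete, here is a toy sketch of episodic sampling with a configurable number of classes; it is illustrative only and does not reproduce the exact sampling code or class counts of either the original paper or the report:

```python
import random

def sample_episode(class_to_images, n_way, k_shot, n_query):
    """Sample one n_way-way, k_shot-shot episode from a {class: [images]} dict.

    The number of classes per training episode is a free choice: raising
    n_way at training time makes each task harder than the n_way used at
    evaluation, which is the fairness-of-comparison issue discussed in
    paragraph 4 of the report.
    """
    classes = random.sample(list(class_to_images), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        images = random.sample(class_to_images[cls], k_shot + n_query)
        support += [(img, label) for img in images[:k_shot]]
        query += [(img, label) for img in images[k_shot:]]
    return support, query

# Hypothetical usage (class counts here are illustrative values only):
# train_episode = sample_episode(train_split, n_way=10, k_shot=5, n_query=15)
# test_episode  = sample_episode(test_split,  n_way=5,  k_shot=5, n_query=15)
```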

While the statement on the vagueness of the stopping criterion chosen in the work by Bertinetto et al. (2019) is valid, the choice made in this report certainly does not seem to match the original description.

The introduction paragraph could use more citations to prior work.

A long (perhaps excessive?) background paragraph provides a pedagogical introduction to the architecture introduced by Bertinetto et al. (2019) and to the broader field of meta-learning. Although this is helpful for assessing the authors' familiarity with the subject, it may not be relevant or appropriate for a reproducibility analysis paper.

The language used in the paper is at times too colloquial.

Sufficient experimentation and an attempt at discussing the observed results are present in this report. More in-depth work to determine the compatibility of these results with the original ones, including a better estimation of possible systematic deviations due to hyperparameter choices, would be beneficial. Confidence : 4
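
One concrete way to support such a compatibility check, sketched below, is to report a 95% confidence interval over evaluation episodes for each hyperparameter setting; this is an illustration of standard few-shot reporting practice, not code taken from the report:

```python
import numpy as np

def mean_and_ci95(accuracies):
    """Mean accuracy and 95% confidence half-width over evaluation episodes.

    Reporting this interval for each hyperparameter setting makes it
    easier to judge whether reproduced numbers are statistically
    compatible with the original ones.
    """
    accs = np.asarray(accuracies, dtype=float)
    mean = accs.mean()
    half_width = 1.96 * accs.std(ddof=1) / np.sqrt(len(accs))
    return mean, half_width

# e.g. over a few hundred test episodes:
# mean, ci = mean_and_ci95(episode_accuracies)
# print(f"{100 * mean:.2f} ± {100 * ci:.2f} %")
```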

reproducibility-org commented 5 years ago

Hi, please find below a review submitted by one of the reviewers:

Score: 9 Reviewer 3 comment : TA Review

NB: This TA review has been provided by the institution directly, and the authors have communicated with the reviewers regarding changes/updates. Confidence : 4

reproducibility-org commented 5 years ago

Meta Reviewer Decision: Accept