
ICLR Reproducibility Challenge 2019
https://reproducibility-challenge.github.io/iclr_2019/

DNNtakeover #156

HarlinLee opened this issue 5 years ago

HarlinLee commented 5 years ago

Issue #39

reproducibility-org commented 5 years ago

Hi, please find below a review submitted by one of the reviewers:

Score: 6 Reviewer 1 comment: The participants have understood the context and solution of the paper. They explain the work in enough detail to grasp the idea. They have implemented all the components of the pipeline in the paper and have provided Jupyter notebooks to recreate the experiments. This makes their experiments straightforward to verify and build on.

The participants have managed to recreate the experiments in the paper. They report numbers close to the original paper even though they do not use the same ensemble. However, the difference in ensemble size is a concern for me, since the ensemble seems to be an important part of the pipeline. The paper also does not detail any hyperparameter search that may have been performed. While the reproducibility of the results is encouraging, it would be helpful to understand the extent to which the model was probed to verify this reproducibility.

In trying to pick apart the different components of this pipeline, the participants have been partially successful. They consider a single model, attempt to move the pixel2phase layer to different positions to identify its effect, and call out the claim that hiding the seed is a crucial component. But the pipeline offers more straightforward ablations: testing the permutation and pixel2phase components on their own. These simple experiments were set aside in favor of suggesting improvements to the paper. The attempt to suggest improvements is interesting to me and should be commended; however, it seems to have come at the cost of a necessary experiment.
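
For concreteness, a minimal sketch of the two preprocessing components and the ablation switches I have in mind might look as follows. This is my own illustrative Python, not the participants' code; the channels-last layout, the fixed secret seed, and the FFT-phase mapping are assumptions based on the paper's description:

```python
import numpy as np

def permute_pixels(images, seed):
    """Shuffle pixel positions with a fixed, secret seed (same permutation for every image)."""
    n, h, w, c = images.shape  # assumes channels-last images
    perm = np.random.RandomState(seed).permutation(h * w)
    return images.reshape(n, h * w, c)[:, perm, :].reshape(n, h, w, c)

def pixel2phase(images):
    """Map each image to the phase of its 2-D Fourier transform, discarding magnitude."""
    return np.angle(np.fft.fft2(images, axes=(1, 2)))

def preprocess(images, seed, use_permutation=True, use_phase=True):
    """Ablation switches: apply either component on its own, or both (the full pipeline)."""
    if use_permutation:
        images = permute_pixels(images, seed)
    if use_phase:
        images = pixel2phase(images)
    return images
```

Training and attacking the same classifier under each of the on/off combinations would give the separate ablations described above.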

From the experiments the participants have attempted, they have come up with valid comments and suggestions. However, it is not clear from the report whether these comments have been communicated to the authors. The OpenReview page of the original paper does not have a comment from the participants, which would have been helpful to its reviewers, considering the various directions the participants have explored with this paper.

Finally, the report is readable and the figures are clear. There are minor spelling errors, but they can be overlooked. Confidence: 3

reproducibility-org commented 5 years ago

Hi, please find below a review submitted by one of the reviewers:

Score: 7 Reviewer 3 comment:

reproducibility-org commented 5 years ago

Hi, please find below a review submitted by one of the reviewers:

Score: 8 Reviewer 2 comment: This paper attempts to replicate the results from the PPD paper, which proposes a new method of defending against adversarial attacks by shuffling the input pixels and applying a Fourier transform.

Overall, the paper is well written, and the motivation of the approach is clear. What I really like about this paper is how many different ideas the authors try in order to improve the performance of PPD. These experiments are quite thorough and interesting, and they lead to new insights about the approach.

The main drawback of the replication paper is the limited exploration of the discrepancy in the result for the FGM attack; I would like to understand why this happens. If the replication is correct, this is a significant weakness of the original paper, as the proposed approach would no longer be robust to existing attacks from the literature.
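
As a reference point for such an exploration, a minimal sketch of a one-step FGM (L2 fast gradient method) attack is given below. This is a generic PyTorch formulation under my own assumptions (4-D inputs in [0, 1], cross-entropy loss), not the exact attack configuration used in the report:

```python
import torch

def fgm_attack(model, images, labels, epsilon):
    """One-step L2 fast gradient method: perturb each input by epsilon along the
    L2-normalised gradient of the loss with respect to that input."""
    images = images.clone().detach().requires_grad_(True)  # assumes shape (N, C, H, W)
    loss = torch.nn.functional.cross_entropy(model(images), labels)
    grad, = torch.autograd.grad(loss, images)
    # Normalise the gradient per example so each perturbation has L2 norm epsilon.
    norm = grad.reshape(grad.size(0), -1).norm(p=2, dim=1).clamp_min(1e-12)
    adv = images + epsilon * grad / norm.view(-1, 1, 1, 1)
    return adv.clamp(0.0, 1.0).detach()  # assumes pixel values in [0, 1]
```

Comparing accuracy under this attack with and without the PPD preprocessing, across the epsilon values reported in the original paper, would help locate the source of the discrepancy.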

Other than that, this is a great paper and replication. Well done! Confidence: 3