Closed obaysal closed 6 years ago
The authors apply for the Replicated badge, but unfortunately the artifact does not meet the requirements.
According to the ACM definition, Replicated badges must fulfill the following (https://2018.fseconference.org/track/fse-2018-Artifacts): "Available + main results of the paper have been obtained in a subsequent study by a person or team other than the authors, using, in part, artifacts provided by the author."
Regarding the second part, the submitted artifact lacks evidence of any external study which replicates the artifact results.
Regarding the first part, "Available" implies: "Functional + Placed on a publicly accessible archival repository. A DOI or link to this repository along with a unique identifier for the object is provided." The submitted artifact does not provide any reference to a DOI for the artifact.
In addition, granting the "Reusable" badge requires: "Functional + very carefully documented and well-structured to the extent that reuse and repurposing is facilitated. In particular, norms and standards of the research community for artifacts of this type are strictly adhered to."
The accompanying documentation is limited. For example:
It lacks a description of system requirements. The artifact is intended to run on Linux, and the install and run instructions use Linux commands, but the documentation never states this.
Some steps are not properly described. For example, the INSTALL file requires the installation of an existing tool, Z3, but it is unclear that this is an external dependency. The documentation could be better structured, for example with separate sections describing system requirements, tool dependencies, and installation.
In the INSTALL file, step 2 says: "To compile ALPS, please run make." This step is vague: it should state that the make command must be executed inside the z3_env/z3 folder under the project root; otherwise, the command fails.
In the "Re-run main experiment" section, the documentation says to execute the command: "scripts/run.py alps data/ data/templates [desired directory for logs]". If a log directory is not provided, the command fails. Typically, "[ ]" is the notation for an optional parameter, but here the parameter is compulsory.
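To illustrate, a working invocation has to supply a log directory explicitly (here `logs` is a hypothetical name; the `run.py` path is from the artifact's README):

```shell
# "logs" is a hypothetical directory name. Despite the bracket notation in
# the README, run.py fails when the fourth argument is omitted, so create a
# log directory first and always pass it explicitly:
mkdir -p logs
# Then, from the artifact root:
#   scripts/run.py alps data/ data/templates logs
```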
I could run the tool, but it was not straightforward to install and run due to the issues mentioned above. I created a Linux virtual machine to be able to run the tool.
Consequently, the artifact meets the requirements for the Functional badge.
We thank the reviewers for their helpful feedback, which we will incorporate in the revision of our artifact.
It seems that we misunderstood certain requirements of the Replicated badge. For example, we assumed that the artifact evaluation itself could serve as "an external study which replicates the artifact results". As for "Available", we did provide a publicly accessible link during submission, namely a public GitHub repo: https://github.com/XujieSi/fse18-artifact-183/releases. We are willing to create a DOI for our artifact if that is the only acceptable standard. Given our confusion about the requirements of the various badges, we are also open to whichever badges the reviewers consider most appropriate for our artifact.
We agree that we should make it explicit that the submitted artifact is intended to be set up on a Linux platform. However, it does NOT require a separate installation of Z3, nor does make have to be executed inside the z3_env/z3 folder.
A pre-built Z3 library for Linux is already included, and the provided "setenv" script sets up the environment. A single make command after setting up the environment (i.e., running . setenv) should suffice, as described in step 2.
For platforms other than Linux, one does need to compile the Z3 library and adjust the Z3 library path in the Makefile accordingly.
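To make the intended Linux build sequence concrete, here is a sketch of the steps the authors describe (`setenv` and `make` are from the artifact itself; the `demo_setenv` lines are a hypothetical stand-in showing why sourcing matters):

```shell
# From the artifact's root directory on a Linux machine (authors' step 2):
#   . setenv   # source the script so the bundled pre-built Z3 is on the path
#   make       # a single make then builds ALPS
# The leading dot matters: setenv must be SOURCED into the current shell,
# not executed in a subshell, or its exported paths are lost. A stand-in demo:
printf 'export DEMO_Z3_READY=1\n' > demo_setenv   # hypothetical setenv
. ./demo_setenv                                   # sourcing keeps the export
test "$DEMO_Z3_READY" = 1 && echo "environment set"
```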
Is there confusion in the badge requirements? We stated them as clearly as we could in our CFP. Can you point to any confusing text that we need to fix?
@timm Thanks for clarifying!
Specifically, I am confused by the requirement of the Available badge: "A DOI or link", which sounds as if either a DOI or some publicly accessible URL provided by a cloud vendor (e.g., Dropbox, Google Drive, a GitHub repo) should be fine. But the actual expectation seems to be a DOI only, as pointed out by our reviewer.
Similarly, the requirement of the Replicated badge, "by a person or team other than the authors", is a little confusing to me. Could the anonymous reviewers be the "person or team other than the authors"? Or do we (the authors) need to contact some person or team and show evidence that the artifact's results are reproducible before the artifact evaluation?
@XujieSi A DOI is an archive-quality name for a thing. So you can get a DOI by putting your artifact in some archival service -- e.g. Zenodo, which is popular in the SE community. That's what I did. A GitHub repository is not archive-quality, since you can force-push to it and rewrite the history (perils of mining git). It looks like GitHub actually integrates with Zenodo, which might be easier than using Zenodo independently.
@davisjam Thanks for explaining the DOI, which I did not know much about. I will prepare a DOI for our artifact. Then, perhaps the description of the Available badge should just say "A DOI to" instead of "A DOI or link to", because "link" could mean a lot of things.
@XujieSi See also this recent remark from @timm.
Sorry for my late review.
At least for me, it is difficult to reproduce the main experiment.
I tried to perform the "Examine experimental logs" step in the README.md file; however, I got an error message as follows:
```
Traceback (most recent call last):
  File "scripts/parse_log.py", line 6, in
```
The other steps completed successfully and were easy to follow. So I believe that this artifact meets the requirements of the "Functional" badge.
In addition, the authors have made their tool publicly available in a GitHub repository with a DOI. Thus, I believe the artifact qualifies for the "Available" badge.
https://github.com/researchart/fse18/tree/master/submissions/datalog