[x] Line 9: perhaps "to characterize [or account for] uncertainty arising from models' heterogeneity"? The current "deal with" implies you're reconciling models, which doesn't seem accurate
[x] L. 20: "a prominent" or "a well-regarded"
[x] L. 21: "used for different"
[x] L. 23: "reproduce outputs and hindering the transparency of results"
[x] L. 34 example: what is awesomeProject.dat? Not clear. As currently written, this example doesn't work, because this object doesn't exist. Either clarify that this is the user's project output, or, better, provide a dat file included with gcamreport so that users (and reviewers) can easily test things out
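    One way this could look (a sketch only — `example.dat` and the `inst/extdata/` location are hypothetical, assuming the authors choose to bundle a sample project file with the package):

    ```r
    # Hypothetical: locate a sample .dat file shipped inside gcamreport,
    # assuming the package bundled one under inst/extdata/.
    # system.file() is base R and returns "" if the file is not found.
    dat_path <- system.file("extdata", "example.dat", package = "gcamreport")

    # Reviewers and users could then pass dat_path to the reporting
    # workflow instead of an undefined awesomeProject.dat object.
    print(dat_path)
    ```

    Even a small, truncated project file would make the quick-start example reproducible out of the box.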
[x] L. 45: "used in prominent"
[x] L. 51: "these kinds of assessments"
[x] L. 59: "by dividing"
[x] L. 66: "Here we present gcamreport, a powerful"
[x] L. 87: "facilitate data"
[x] This is an extremely brief description of the package's functionality, in particular of the dataset generation block. That is, I understand what's happening in general terms, but how does it happen? I gather that there are built-in mapping files to translate GCAM outputs to the IAMC template, but how does the user access them? Perhaps the text could walk through a specific example of this process
[x] Figure 2 is so small as to be almost illegible. Please think about ways to make it more useful for readers
[x] It would be good to mention performance. How long does this take to run?
https://github.com/openjournals/joss-reviews/issues/5975