openjournals / joss-reviews

Reviews for the Journal of Open Source Software
Creative Commons Zero v1.0 Universal

[REVIEW]: Ciclope: micro Computed Tomography to Finite Elements #4952

Closed editorialbot closed 1 year ago

editorialbot commented 1 year ago

Submitting author: <!--author-handle-->@gianthk<!--end-author-handle--> (Gianluca Iori)
Repository: https://github.com/gianthk/ciclope
Branch with paper.md (empty if default branch):
Version: v1.3.4
Editor: <!--editor-->@danasolav<!--end-editor-->
Reviewers: @dankusanovic, @jacobmerson
Archive: 10.5281/zenodo.7716943

Status

status

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/8f9176ffe46a0ef54f93ad860ff7b731"><img src="https://joss.theoj.org/papers/8f9176ffe46a0ef54f93ad860ff7b731/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/8f9176ffe46a0ef54f93ad860ff7b731/status.svg)](https://joss.theoj.org/papers/8f9176ffe46a0ef54f93ad860ff7b731)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@dankusanovic & @jacobmerson, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review. First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @danasolav know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest.

Checklists

📝 Checklist for @jacobmerson

📝 Checklist for @dankusanovic

editorialbot commented 1 year ago

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf
editorialbot commented 1 year ago
Software report:

github.com/AlDanial/cloc v 1.88  T=0.21 s (280.2 files/s, 364097.5 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          10            476            935           1209
MATLAB                          15            207            137           1030
Jupyter Notebook                10              0          68671            887
Markdown                        13            143              8            733
TeX                              1             42              0            583
YAML                             3              6             14             58
TOML                             1              4              0             55
reStructuredText                 3             33             50             38
DOS Batch                        1              8              1             26
make                             1              4              7              9
-------------------------------------------------------------------------------
SUM:                            58            923          69823           4628
-------------------------------------------------------------------------------

gitinspector failed to run statistical information for the repository
editorialbot commented 1 year ago

Wordcount for paper.md is 1207

editorialbot commented 1 year ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

editorialbot commented 1 year ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1016/S0021-9290(03)00257-4 is OK
- 10.1016/S8756-3282(02)00736-6 is OK
- 10.1016/j.jbiomech.2020.110205 is OK
- 10.1115/1.2746368 is OK
- 10.1016/j.jbiomech.2008.02.032 is OK
- 10.1115/1.2146001 is OK
- 10.1007/s10237-019-01254-x is OK
- 10.1371/journal.pone.0180151 is OK
- 10.1007/978-3-642-29843-1_56 is OK
- 10.1016/j.parco.2011.08.001 is OK
- 10.1016/j.jmbbm.2020.104190 is OK
- 10.1016/j.jmbbm.2018.06.022 is OK
- 10.1016/j.medntd.2021.100088 is OK
- 10.1115/1.4028968 is OK
- 10.1007/s10237-007-0109-7 is OK
- 10.1016/j.bonr.2022.101179 is OK
- 10.1016/j.jmbbm.2022.105303 is OK
- 10.1101/2021.11.30.470675 is OK
- 10.1016/j.cmpb.2022.106764 is OK
- 10.1016/j.jbiomech.2003.10.002 is OK
- 10.1007/s10237-008-0125-2 is OK
- 10.5281/zenodo.1173115 is OK
- 10.5281/zenodo.5564818 is OK
- 10.1107/S160057751401604X is OK
- 10.1002/adem.201900080 is OK
- 10.1016/j.jmbbm.2016.11.020 is OK
- 10.1080/10255840701848756 is OK
- 10.1080/10255840410001656408 is OK
- 10.1038/s41592-019-0686-2 is OK
- 10.7717/peerj.453 is OK

MISSING DOIs

- 10.1109/tsmc.1979.4310076 may be a valid DOI for title: A threshold selection method from gray-level histograms
- 10.1109/tsmc.1978.4310039 may be a valid DOI for title: Picture thresholding using an iterative selection method

INVALID DOIs

- https://doi.org/10.1002/cnm.2941 is INVALID because of 'https://doi.org/' prefix
jacobmerson commented 1 year ago

Review checklist for @jacobmerson

Conflict of interest

Code of Conduct

General checks

Functionality

Documentation

Software paper

dankusanovic commented 1 year ago

Review checklist for @dankusanovic

Conflict of interest

Code of Conduct

General checks

Functionality

Documentation

Software paper

jacobmerson commented 1 year ago

@gianthk A few issues as I work through the review

  1. Are there tests that can be run either automatically or manually (with documented steps)? I do not see any.
  2. Since your examples include data from humans and animals, please review the policies here: https://joss.readthedocs.io/en/latest/policies.html?highlight=animal#joss-policies and add the appropriate statements to your paper and documentation.
  3. You are missing the data needed to reproduce your examples. See https://github.com/gianthk/ciclope/issues/9
  4. Missing community guidelines: https://github.com/gianthk/ciclope/issues/10
gianthk commented 1 year ago

@jacobmerson thank you for the revisions! This is my first JOSS paper. How does it work? Can I start working on these already, commit changes, and solve the issues you opened?

gianthk commented 1 year ago
1. Are there tests that can be run either automatically or manually (with documented steps?). I do not see this

@jacobmerson the examples cover all steps of the software's functionality. We will upload the test dataset to reproduce the examples. On top of this, are automated tests mandatory?

jacobmerson commented 1 year ago

> @jacobmerson thank you for the revisions! This is my first JOSS paper. How does it work? Can I start working on these already, commit changes, and solve the issues you opened?

Yes! Please address any issues that we bring up through the review process as we go. One nice thing about JOSS is that things can be iterative.

> @jacobmerson the examples cover all steps of the software's functionality. We will upload the test dataset to reproduce the examples. On top of this, are automated tests mandatory?

Please take a look at the review criteria here. You do provide a lot of examples. However, based on my interpretation of the review criteria, these examples alone are not enough. You need some mechanism for me to check the correctness of the outputs, either manual or automated (preferred). You don't necessarily need unit testing, but you do need integration tests.

In other words, given some set of input files and parameters, do I get the correct input files for Abaqus/Calculix out? As much of your public API as possible should have some form of testing.
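A golden-file integration test along these lines could look roughly as follows. This is a hedged sketch, not ciclope's actual test suite: `write_inp` is a hypothetical stand-in for the real voxel-to-`.inp` exporter, and the expected string stands in for a reference fixture file committed to the repository.

```python
# Sketch of a golden-file integration test. `write_inp` is a hypothetical
# stand-in for the real voxel-model exporter; a real test would compare the
# generated .inp against a reference file checked into the repository.
import numpy as np

def write_inp(voxels, path):
    """Emit one node line per solid voxel, deterministically, so the
    output can be compared byte-for-byte against a reference."""
    with open(path, "w") as f:
        f.write("*NODE\n")
        for i, (z, y, x) in enumerate(np.argwhere(voxels), start=1):
            f.write(f"{i}, {float(x)}, {float(y)}, {float(z)}\n")

def test_inp_matches_reference(out="generated.inp"):
    # Fixed, tiny input: two solid voxels at opposite corners.
    voxels = np.zeros((2, 2, 2), dtype=bool)
    voxels[0, 0, 0] = voxels[1, 1, 1] = True
    write_inp(voxels, out)
    expected = "*NODE\n1, 0.0, 0.0, 0.0\n2, 1.0, 1.0, 1.0\n"
    with open(out) as f:
        assert f.read() == expected

test_inp_matches_reference()
```

Run under pytest, a suite of such cases over the public API would cover the review criterion discussed above.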

gianthk commented 1 year ago

@jacobmerson OK, got it. We will need some time to address all the issues.

dankusanovic commented 1 year ago

@gianthk I agree with @jacobmerson. In my opinion, the paper and examples read and execute well, and the pip installation is quite nice. However, there is no (automatic) test validating that the elements are properly implemented. I suggest including a few that show that Hexa, Tetra, Quad, and so on work well. Along the same lines, it would be nice to compare against some other software for performance and validation of results. Abaqus and Ansys could help if a license is available; otherwise FreeFEM, FEniCS, and deal.II are open-source possibilities.
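Element-level checks of this kind can be lightweight. As an illustration only (assumed names; this is not ciclope's test suite), the trilinear shape functions of an 8-node hexahedron can be verified for partition of unity and the Kronecker-delta property with plain numpy:

```python
# Sanity checks for a hypothetical 8-node hexahedron implementation:
# shape functions must sum to 1 everywhere (partition of unity) and
# each N_i must equal 1 at its own corner and 0 at the others.
import numpy as np

# Corner sign pattern of the reference hex [-1, 1]^3
CORNERS = np.array([[sx, sy, sz] for sz in (-1, 1)
                    for sy in (-1, 1) for sx in (-1, 1)])

def hex8_shape(xi, eta, zeta):
    """Trilinear shape functions N_i at natural coordinates (xi, eta, zeta)."""
    return 0.125 * (1 + CORNERS[:, 0] * xi) * \
                   (1 + CORNERS[:, 1] * eta) * \
                   (1 + CORNERS[:, 2] * zeta)

rng = np.random.default_rng(0)
for xi, eta, zeta in rng.uniform(-1, 1, size=(100, 3)):
    assert np.isclose(hex8_shape(xi, eta, zeta).sum(), 1.0)

for i, (sx, sy, sz) in enumerate(CORNERS):
    N = hex8_shape(sx, sy, sz)
    assert np.isclose(N[i], 1.0) and np.isclose(N.sum(), 1.0)
```

Cross-software comparisons (e.g. against FreeFEM or FEniCS on a simple compression case) would then validate the assembled solution rather than the element alone.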

gianthk commented 1 year ago

@jacobmerson @dankusanovic here we go. Thank you very much for reviewing all of ciclope's functionalities! It took us some time, but it was a chance to find and debug more installation issues.

installation

unit testing

test_data for examples and tests

The new release, v1.3.1, should incorporate all the items discussed in the review.

danasolav commented 1 year ago

@jacobmerson @dankusanovic , could you please review the changes reported by @gianthk and update your checklists accordingly, or raise new issues if you have any?

jacobmerson commented 1 year ago

@danasolav I'm hoping to get this done within the next week.

jacobmerson commented 1 year ago

@gianthk

gianthk commented 1 year ago
> * [ ]  Update the paper with the appropriate statements for human/animal testing? [statement for animal/human samples for JOSS paper gianthk/ciclope#14](https://github.com/gianthk/ciclope/issues/14)

Yes, we did. The latest paper version is in this Action's run.

The microCT bone dataset used in the examples is part of the public collection of the Living Human Digital Library (LHDL) project, a project financed by the European Commission (project number: FP6-IST 026932). Human tissues in the LHDL project were collected according to the body donation program of Université Libre de Bruxelles (ULB), a partner of the project.

jacobmerson commented 1 year ago

@editorialbot generate paper

editorialbot commented 1 year ago

I'm sorry human, I don't understand that. You can see what commands I support by typing:

@editorialbot commands

jacobmerson commented 1 year ago

@editorialbot commands

editorialbot commented 1 year ago

Hello @jacobmerson, here are the things you can ask me to do:


# List all available commands
@editorialbot commands

# Get a list of all editors's GitHub handles
@editorialbot list editors

# Check the references of the paper for missing DOIs
@editorialbot check references

# Perform checks on the repository
@editorialbot check repository

# Adds a checklist for the reviewer using this command
@editorialbot generate my checklist

# Set a value for branch
@editorialbot set joss-paper as branch

# Generates the pdf paper
@editorialbot generate pdf

# Generates a LaTeX preprint file
@editorialbot generate preprint

# Get a link to the complete list of reviewers
@editorialbot list reviewers
jacobmerson commented 1 year ago

@editorialbot generate pdf

editorialbot commented 1 year ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

jacobmerson commented 1 year ago

@danasolav can you help me understand if the statements made by the authors about human/animal samples are adequate for JOSS?

They provide a reference to the public access database that they retrieved the imaging samples from (and did not collect the samples themselves). However, the JOSS guidelines require

> The manuscript must include complete reporting of the data, including the ways that the study was approved and relevant details of the sample. Authors are required to report either the ARRIVE or PREPARE guidelines for animal research in the submission.

They do not mention the ARRIVE or PREPARE guidelines, however. From a pragmatic standpoint, it seems that referencing the public database is adequate, since they did not collect the samples themselves.

dankusanovic commented 1 year ago

Just waiting for @jacobmerson's inquiry. On my side, if that is appropriate, then the paper is ready to go. The list has been updated accordingly.

jacobmerson commented 1 year ago

@gianthk @danasolav @dankusanovic I just went through the JOSS requirements again. They do specify that "original" research needs to include those guidelines; however, with their paper edits it is clear that this is not original animal/human research.

They should be all set.

jacobmerson commented 1 year ago

@gianthk A few comments on the paper. I don't consider these to be blockers, but they might improve the quality of your paper:

Lines 33-34: you may want to indicate how standard microCT datasets can be converted to numpy arrays. This discussion may be too much for the paper, but you should have it in the documentation, or at least a description of the structure of the expected inputs: a t x n x n array, where t is the number of frames through the thickness and n is the number of pixels across each edge.

Line 45: you should say what a .inp file is, i.e., that it is the input file format for Abaqus/Calculix.

Line 50: why is material mapping bolded?

Lines 56-65: you mention that there are components you rely on for visualization, but have not included any description of what visualization capabilities exist within Ciclope.
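To make the suggested input description concrete, here is a minimal sketch of the t x n x n layout built from synthetic slices. The values are fabricated placeholders; loading a real multi-page TIFF stack (e.g. with tifffile.imread) would yield the same axis order.

```python
# Expected input layout: a 3D array of shape (t, n, n), one n-by-n
# pixel slice per CT frame through the thickness of the sample.
import numpy as np

t, n = 5, 64                            # 5 frames, 64 px across each edge
slices = [np.zeros((n, n), dtype=np.uint8) for _ in range(t)]
for k, s in enumerate(slices):
    s[16:48, 16:48] = 100 + 10 * k      # fake "bone" region in each frame

volume = np.stack(slices, axis=0)       # axis 0 indexes the frames
assert volume.shape == (t, n, n)
```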

jacobmerson commented 1 year ago

@gianthk @danasolav @dankusanovic My review is complete. I would like to recommend this work for publication. This paper provides open source tools to aid in the construction of finite element models from microCT image stack data including basic tools for segmentation, as well as the construction of input files for Abaqus and Calculix. These types of pipelines are challenging to construct, but lay the foundation for important advancements in the use of patient specific data in research and clinical settings.

I have suggested some minimal clarifying edits to the authors; however, I don't consider them required.

gianthk commented 1 year ago

@jacobmerson thanks for the comments above. I will go through the minor edits tomorrow.

gianthk commented 1 year ago

> Line 33-34: you may want to indicate how standard microCT datasets can be converted to numpy arrays.

  • [X] github
  • [X] readthedocs

> Line 45: you should say what a .inp file is. i.e., say that it is the input file format for Abaqus/Calculix

  • [X] done

> Line 50: why is material mapping bolded?

  • [X] bold removed

> Line 56-65: You mention that there are components you rely on for visualization, but have not included any description of what visualization capabilities exist within Ciclope

  • [X] add link to jupyter notebooks in the paper
  • [X] ..on readthedocs
  • [X] ..on the github readme
danasolav commented 1 year ago

@dankusanovic please check your last unchecked box: "Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item."

danasolav commented 1 year ago

@editorialbot generate pdf

dankusanovic commented 1 year ago

Dear @danasolav, I am sorry for the delay; I had completely forgotten it! It is done now. Thanks for the reminder.

danasolav commented 1 year ago

@editorialbot generate pdf

editorialbot commented 1 year ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

danasolav commented 1 year ago

@gianthk, please refer to my comments regarding the paper in this issue.

gianthk commented 1 year ago

@danasolav please see all our replies to your open issue and let us know if you have more corrections/suggestions.

gianthk commented 1 year ago

@danasolav did you have the time to review our replies to your open issue?

danasolav commented 1 year ago

@gianthk sorry for the delay, I closed the issue now.

danasolav commented 1 year ago

@editorialbot generate pdf

editorialbot commented 1 year ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

danasolav commented 1 year ago

@editorialbot check references

editorialbot commented 1 year ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1016/S0021-9290(03)00257-4 is OK
- 10.1016/S8756-3282(02)00736-6 is OK
- 10.1016/j.jbiomech.2020.110205 is OK
- 10.1115/1.2746368 is OK
- 10.1016/j.jbiomech.2008.02.032 is OK
- 10.1115/1.2146001 is OK
- 10.1007/s10237-019-01254-x is OK
- 10.1371/journal.pone.0180151 is OK
- 10.1002/cnm.2941 is OK
- 10.1007/978-3-642-29843-1_56 is OK
- 10.1016/j.parco.2011.08.001 is OK
- 10.1016/j.jmbbm.2020.104190 is OK
- 10.1016/j.jmbbm.2018.06.022 is OK
- 10.1016/j.medntd.2021.100088 is OK
- 10.1115/1.4028968 is OK
- 10.1007/s10237-007-0109-7 is OK
- 10.1016/j.bonr.2022.101179 is OK
- 10.1016/j.jmbbm.2022.105303 is OK
- 10.1101/2021.11.30.470675 is OK
- 10.1016/j.cmpb.2022.106764 is OK
- 10.1016/j.jbiomech.2003.10.002 is OK
- 10.1007/s10237-008-0125-2 is OK
- 10.5281/zenodo.1173115 is OK
- 10.5281/zenodo.5564818 is OK
- 10.1107/S160057751401604X is OK
- 10.1002/adem.201900080 is OK
- 10.1016/j.jmbbm.2016.11.020 is OK
- 10.1080/10255840701848756 is OK
- 10.1080/10255840410001656408 is OK
- 10.1038/s41592-019-0686-2 is OK
- 10.7717/peerj.453 is OK

MISSING DOIs

- 10.1109/tsmc.1979.4310076 may be a valid DOI for title: A threshold selection method from gray-level histograms
- 10.1109/tsmc.1978.4310039 may be a valid DOI for title: Picture thresholding using an iterative selection method

INVALID DOIs

- None
danasolav commented 1 year ago

@gianthk please check the missing DOIs and consider adding them to your references.

danasolav commented 1 year ago

@gianthk @jacobmerson @dankusanovic Thank you all for your good work and for recommending acceptance of this submission. @gianthk, after correcting the missing DOIs, please issue a new tagged release and archive it (on Zenodo, figshare, or other). Then, please post the version number and archive DOI here.

gianthk commented 1 year ago

@editorialbot check references

editorialbot commented 1 year ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1016/S0021-9290(03)00257-4 is OK
- 10.1016/S8756-3282(02)00736-6 is OK
- 10.1016/j.jbiomech.2020.110205 is OK
- 10.1115/1.2746368 is OK
- 10.1016/j.jbiomech.2008.02.032 is OK
- 10.1115/1.2146001 is OK
- 10.1007/s10237-019-01254-x is OK
- 10.1371/journal.pone.0180151 is OK
- 10.1002/cnm.2941 is OK
- 10.1007/978-3-642-29843-1_56 is OK
- 10.1016/j.parco.2011.08.001 is OK
- 10.1016/j.jmbbm.2020.104190 is OK
- 10.1016/j.jmbbm.2018.06.022 is OK
- 10.1016/j.medntd.2021.100088 is OK
- 10.1115/1.4028968 is OK
- 10.1007/s10237-007-0109-7 is OK
- 10.1016/j.bonr.2022.101179 is OK
- 10.1016/j.jmbbm.2022.105303 is OK
- 10.1101/2021.11.30.470675 is OK
- 10.1016/j.cmpb.2022.106764 is OK
- 10.1016/j.jbiomech.2003.10.002 is OK
- 10.1109/tsmc.1979.4310076 is OK
- 10.1109/tsmc.1978.4310039 is OK
- 10.1007/s10237-008-0125-2 is OK
- 10.5281/zenodo.1173115 is OK
- 10.5281/zenodo.5564818 is OK
- 10.1107/S160057751401604X is OK
- 10.1002/adem.201900080 is OK
- 10.1016/j.jmbbm.2016.11.020 is OK
- 10.1080/10255840701848756 is OK
- 10.1080/10255840410001656408 is OK
- 10.1038/s41592-019-0686-2 is OK
- 10.7717/peerj.453 is OK

MISSING DOIs

- None

INVALID DOIs

- None
gianthk commented 1 year ago

@danasolav the new release is v1.3.4. Zenodo DOI: https://doi.org/10.5281/zenodo.7716943

danasolav commented 1 year ago

@editorialbot set https://doi.org/10.5281/zenodo.7716943 as archive