Closed: jluethi closed this issue 5 months ago
Another small nitpick: the test suite currently generates 270 warnings. It would be good to review whether any of them are critical.
Hello! My comment is also in the context of the JOSS review. I agree with @jluethi's comments. In addition, I think the existing tests can be improved. Many of the tests have a similar form: they check a single value from the output, rounded to 2 decimal places.
I think this could be improved by testing the full array of values. Additionally, I'm not sure about the expected precision, but I find it a bit surprising to round to 2 decimal places. I would suggest having a look at the numpy testing suite for some utilities for doing array comparisons and float array comparisons: https://numpy.org/doc/stable/reference/routines.testing.html
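A minimal sketch of what this could look like with the numpy testing utilities. The arrays here are made-up illustrative values, not actual scimap output:

```python
import numpy as np

# Hypothetical expected/actual values for illustration; a real test
# would use the tool's actual output (e.g. values stored in adata).
result = np.array([0.123456, 1.987654, 3.141592])
expected = np.array([0.123455, 1.987655, 3.141593])

# Compare the full array with an explicit tolerance instead of
# rounding individual entries to 2 decimal places.
np.testing.assert_allclose(result, expected, rtol=1e-5, atol=1e-5)

# assert_array_almost_equal is an alternative that checks agreement
# to a given number of decimals across the whole array.
np.testing.assert_array_almost_equal(result, expected, decimal=5)
```

Both helpers report the mismatching positions and the mismatch fraction on failure, which makes debugging much easier than a bare `assert round(x, 2) == y`.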
Thanks!
Dear @jluethi and @kevinyamauchi,
Thank you very much for your valuable suggestions. I have now incorporated tests for most functions. @jluethi, as most functions (tools) in this package are independent of each other, I was not entirely sure how an integrated workflow test would improve on the individual tests, so I have not implemented one yet, but I am happy to if you think it would be beneficial. For figures, I resorted to saving a plot and checking that the file exists. @kevinyamauchi, your suggestion to test the entire array was extremely helpful. I have implemented that for most tools.
Working on your other comments now. Thank you both so much for your time reviewing this work.
```
36 passed, 16 warnings in 39.23s
Name                                           Stmts   Miss  Cover
------------------------------------------------------------------
scimap/__init__.py                                12      2    83%
scimap/external/__init__.py                        0      0   100%
scimap/helpers/__init__.py                         8      0   100%
scimap/helpers/addROI_omero.py                    60     52    13%
scimap/helpers/animate.py                        185    176     5%
scimap/helpers/classify.py                        55     21    62%
scimap/helpers/downloadDemoData.py                29     24    17%
scimap/helpers/dropFeatures.py                    42     25    40%
scimap/helpers/merge_adata_obs.py                 47     21    55%
scimap/helpers/rename.py                          17      1    94%
scimap/helpers/scimap_to_csv.py                   48     24    50%
scimap/plotting/__init__.py                       19      0   100%
scimap/plotting/addROI_image.py                  148    131    11%
scimap/plotting/cluster_plots.py                  66     52    21%
scimap/plotting/densityPlot2D.py                  82     29    65%
scimap/plotting/distPlot.py                       94     43    54%
scimap/plotting/foldchange.py                     62     24    61%
scimap/plotting/gate_finder.py                    97     86    11%
scimap/plotting/groupCorrelation.py               88     35    60%
scimap/plotting/heatmap.py                       148     64    57%
scimap/plotting/image_viewer.py                   81     70    14%
scimap/plotting/markerCorrelation.py              97     42    57%
scimap/plotting/pie.py                            78     33    58%
scimap/plotting/spatialInteractionNetwork.py     103     27    74%
scimap/plotting/spatial_distance.py               83     50    40%
scimap/plotting/spatial_interaction.py           101     58    43%
scimap/plotting/spatial_pscore.py                 47     11    77%
scimap/plotting/spatial_scatterPlot.py            97     27    72%
scimap/plotting/stacked_barplot.py                71     27    62%
scimap/plotting/umap.py                          144     61    58%
scimap/plotting/voronoi.py                       185     78    58%
scimap/preprocessing/__init__.py                   4      0   100%
scimap/preprocessing/combat.py                    38     14    63%
scimap/preprocessing/log1p.py                     30     16    47%
scimap/preprocessing/mcmicro_to_scimap.py         83     33    60%
scimap/preprocessing/rescale.py                  169     93    45%
scimap/tests/test_hl.py                           43      5    88%
scimap/tests/test_pl.py                          118      0   100%
scimap/tests/test_pp.py                           40      0   100%
scimap/tests/test_tl.py                           74      0   100%
scimap/tools/__init__.py                          13      0   100%
scimap/tools/cluster.py                          153    102    33%
scimap/tools/foldchange.py                        81     21    74%
scimap/tools/phenotype_cells.py                  158     39    75%
scimap/tools/spatial_aggregate.py                 65     15    77%
scimap/tools/spatial_cluster.py                   48     24    50%
scimap/tools/spatial_count.py                     57     15    74%
scimap/tools/spatial_distance.py                  40     14    65%
scimap/tools/spatial_expression.py                97     41    58%
scimap/tools/spatial_interaction.py               88     21    76%
scimap/tools/spatial_lda.py                       83     16    81%
scimap/tools/spatial_pscore.py                    77     15    81%
scimap/tools/spatial_similarity_search.py        127     42    67%
scimap/tools/umap.py                              14      3    79%
------------------------------------------------------------------
TOTAL                                           4094   1823    55%
```
@ajitjohnson This is looking very promising! With the addition of your tests of whole arrays, consider my wish for integration testing covered as well. As mentioned, I mostly wanted to see validation of more than existence of an output, which you appear to have nicely implemented here now.
If you want to simplify your life in finding where the coverage is still missing, I recommend using a tool like Codecov, see https://about.codecov.io/
It both shows you which lines lack coverage and has a nice GitHub Action to report the test coverage of any submitted PR :)
Hey @ajitjohnson. Thanks for the improvements to the testing suite! For the purpose of the JOSS review, the updates address my comments.
Hey scimap team (@ajitjohnson)
Very nice package that you've built here! As part of the review for the JOSS paper, I looked into the tests for the package a bit and generated a coverage report.
You have a very nice start with decent coverage of the tools. It's very nice that you have compact unit tests for the different functions. I think the tools would benefit from some sort of integration test, e.g. running a whole workflow and validating its output a bit more thoroughly than checking a single item in the returned adata object.
The preprocessing & plotting subpackages have very low test coverage at the moment. It would be great if you added a few tests for the different preprocessing functions. Testing plotting is admittedly harder, but I saw that some of your plotting functions also return data, so you could test that they correctly processed the data you passed to them and return valid objects. Since your plotting functions seem to do significant data processing before plotting, you can check whether that processing was done correctly rather than only checking the figure. See also this discussion on testing plotting functions: https://stackoverflow.com/questions/27948126/how-can-i-write-unit-tests-against-code-that-uses-matplotlib
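For example, a test along these lines would validate the returned data rather than just the figure. The `dist_plot` here is a hypothetical stand-in for one of your plotting functions, not actual scimap code:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for CI
import matplotlib.pyplot as plt
import numpy as np


def dist_plot(values):
    """Hypothetical plotting function that, like some scimap plotting
    tools, processes its input (here: log1p) before plotting and
    returns the processed data alongside the axes."""
    processed = np.log1p(values)
    fig, ax = plt.subplots()
    ax.hist(processed)
    plt.close(fig)
    return processed, ax


def test_dist_plot_processes_data():
    values = np.array([0.0, 1.0, 3.0])
    processed, ax = dist_plot(values)
    # Validate the returned, processed data, not just that a
    # figure/axes object was produced.
    np.testing.assert_allclose(processed, np.log1p(values))
    assert ax is not None
```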