
Stingray Submission #201


matteobachetti commented 3 months ago

Submitting Author: @matteobachetti
All current maintainers: @dhuppenkothen @mgullik @jdswinbank @matteolucchini1
Package Name: Stingray
One-Line Description of Package: A spectral-timing software package for astrophysical X-ray (and other) data
Repository Link: https://github.com/stingraysoftware/stingray

Version submitted: 2.1
EiC: @cmarmo
Editor: @hamogu
Reviewer 1: TBD
Reviewer 2: TBD
Archive: TBD
JOSS DOI: TBD
Version accepted: TBD
Date accepted (month/day/year): TBD


Code of Conduct & Commitment to Maintain Package

Description

Stingray is a Python library for "spectral timing", i.e. time-series analysis techniques that can be used to study how variability changes or correlates between different energy bands/wavelengths. It is an Astropy-affiliated package with an ever-growing user base that now comprises hundreds of researchers around the globe.

Scope

Domain Specific

Community Partnerships

If your package is associated with an existing community please check below:


This package is mostly focused on the analysis of new data from high-energy missions. We added data validation and testing because the package can be used as a quick-look tool to point out possible anomalies in observations (e.g. solar or other background flares).

The target audience is principally researchers and students of X-ray and multi-wavelength astronomy. This package fills a void for a free and open-source (spectral-)timing package for X-ray astronomy. XRONOS, formerly maintained by HEASARC, is currently unmaintained, and the analysis of high-energy time series is mostly done with proprietary software or mission-specific packages. We provide a Python package that eases the learning curve for newcomers, thanks in part to extensive tutorials based on Jupyter notebooks, and gives experts a powerful, robust library for their analysis. Our software analyzes astronomical time series and performs a number of tasks, including periodicity searches, time-lag calculations, covariance spectra, and power spectral modeling.

Our package is arguably the best-known Python package for X-ray spectral timing.
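To give a flavor of the user-facing API, here is a minimal sketch of a typical workflow with simulated Poisson data, using Stingray's documented `Lightcurve` and `Powerspectrum` classes:

```python
import numpy as np
from stingray import Lightcurve, Powerspectrum

# Simulate a 100 s light curve binned at 1/8 s with Poisson counting noise.
dt = 0.125
time = np.arange(0, 100, dt)
counts = np.random.poisson(100.0, size=time.size)

lc = Lightcurve(time, counts, dt=dt)

# Leahy-normalized power spectrum, the starting point for periodicity searches.
ps = Powerspectrum(lc, norm="leahy")
print(ps.freq[:3], ps.power[:3])
```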

https://github.com/pyOpenSci/software-submission/issues/195

Technical checks

For details about the pyOpenSci packaging requirements, see our packaging guide. Confirm each of the following by checking the box. This package:

Publication Options

JOSS Checks

- [x] The package has an **obvious research application** according to JOSS's definition in their [submission requirements][JossSubmissionRequirements]. Be aware that completing the pyOpenSci review process **does not** guarantee acceptance to JOSS. Be sure to read their submission requirements (linked above) if you are interested in submitting to JOSS.
- [x] The package is not a "minor utility" as defined by JOSS's [submission requirements][JossSubmissionRequirements]: "Minor ‘utility’ packages, including ‘thin’ API clients, are not acceptable." pyOpenSci welcomes these packages under "Data Retrieval", but JOSS has slightly different criteria.
- [ ] The package contains a `paper.md` matching [JOSS's requirements][JossPaperRequirements] with a high-level description in the package root or in `inst/`. **We have a previous JOSS paper describing the package. Is it acceptable to have a new paper, including the many improvements made in the last five years?**
- [x] The package is deposited in a long-term repository with the DOI:

*Note: JOSS accepts our review as theirs. You will NOT need to go through another full review. JOSS will only review your paper.md file. Be sure to link to this pyOpenSci issue when a JOSS issue is opened for your package. Also be sure to tell the JOSS editor that this is a pyOpenSci reviewed package once you reach this step.*

Are you OK with Reviewers Submitting Issues and/or pull requests to your Repo Directly?

This option allows reviewers to open smaller issues that can then be linked to PRs, rather than submitting a denser, text-based review. It also allows you to demonstrate that you addressed each issue via PR links.

Confirm each of the following by checking the box.

Please fill out our survey

P.S. Have feedback/comments about our review process? Leave a comment here

Editor and Review Templates

The editor template can be found here.

The review template can be found here.

cmarmo commented 3 months ago

Editor in Chief checks

Hi there! I'm Chiara and I am taking care of the Editor in Chief role for the next three months. Thank you for submitting your package for pyOpenSci review. Below are the basic checks that your package needs to pass to begin our review. If any of these are missing, we will ask you to work on them before the review process begins.

Please check our Python packaging guide for more information on the elements below.



Editor comments

Stingray is in excellent form! 💪 Congratulations! I am going to check the availability of our Astropy editors and we will be back to you shortly.

cmarmo commented 3 months ago

Hello @matteobachetti, I am pleased to announce that @hamogu will take care of your submission as editor. Thank you Moritz! I wish you all a nice review process! :)

hamogu commented 3 months ago

Editor response to review:


Editor comments

:wave: Hi @taldcroft and @masonng-astro! Thank you for volunteering to review for pyOpenSci!

Please fill out our pre-review survey

Before beginning your review, please fill out our pre-review survey. This helps us improve all aspects of our review and better understand our community. No personal data will be shared from this survey - it will only be used in an aggregated format by our Executive Director to improve our processes and programs.

The following resources will help you complete your review:

  1. Here is the reviewers guide. This guide contains all of the steps and information needed to complete your review.
  2. Here is the review template that you will need to fill out and submit here as a comment, once your review is complete.

Please get in touch with any questions or concerns! Your review is due:

Reviewers: Tom Aldcroft and Mason Ng
Due date: Sept 6, 2024

hamogu commented 3 months ago

> We have a previous JOSS paper describing the package. Is it acceptable to have a new paper, including the many improvements made in the last five years?

It's up to JOSS, not us, how exactly they handle that. To save everyone time, JOSS will generally accept our review as theirs, because there is really no need for a second set of reviewers to do essentially the same checks that our reviewers just did. However, the final decision to accept into JOSS is up to JOSS.

So, I think there are two ways this can be done, at our choice:

1. Ask JOSS first whether a new paper for an already-published package is acceptable, and start our review only once they respond.
2. Add the `paper.md` to the repository now, and have our reviewers assess it as part of this review.

Option 2 is faster (you don't have to wait for the JOSS response first), but carries the risk that the JOSS editor decides not to publish it later.

Note that JOSS says: "we also review submissions that are significant contributions made to existing packages". Based on this, I suggest option 2 (you add a `paper.md` here before we review), and I will specifically instruct our reviewers to comment on whether they judge the development of the last 5 years to be a "significant contribution".

matteobachetti commented 3 months ago

@hamogu we are working on the JOSS paper draft here: https://github.com/StingraySoftware/stingray/pull/829. The text will not change much; we are working mostly on the author list and acknowledgements. The new paper is `paper2.md` and the bibliography `joss2.bib` (because we have the 2019 paper in the same directory).

hamogu commented 2 months ago

Just let me know when the paper is ready and I'll let the reviewers know to start their review. In the meantime, I'm emailing reviewers to find people to do the review when you are ready.

matteobachetti commented 2 months ago

@hamogu the paper is ready, but I'm still missing a couple of ORCID iDs and affiliations. Does it have to be merged, or is the link to the PR sufficient?

hamogu commented 2 months ago

Just letting you know that I'm in the process of looking for reviewers; both I and some potential reviewers are on summer travel, but I'm hoping to get you two reviewers soon to get this started.

hamogu commented 1 month ago

Just letting you know that I'm still looking for reviewers. Seems that everyone is on vacation right now...

matteobachetti commented 1 month ago

Understandable 😅 No worries!

hamogu commented 1 month ago

:wave: Hi @taldcroft and @masonng-astro! Thanks for agreeing to do a review of stingray for us. Generic instructions (pre-review survey, etc.) are above: https://github.com/pyOpenSci/software-submission/issues/201#issuecomment-2204465146

In addition, note that this package is also submitting to JOSS, and JOSS will accept our review rather than do a separate one. However, stingray already had a JOSS publication a few years ago. So, it would be very helpful to note in your review whether you deem the new features added since then a "significant contribution" that merits a new JOSS publication. You can find the paper draft for JOSS in https://github.com/StingraySoftware/stingray/pull/829

Please reach out to me if you have any questions!

hamogu commented 4 weeks ago

👋 Hi @taldcroft and @masonng-astro! I'm hoping to get your reviews for this package soon! If you have any problems or need help or guidance in how to do a review, please reach out to me any time!

masonng-astro commented 4 weeks ago

(Apologies for any erroneous formatting)

Package Review

Please check off boxes as applicable, and elaborate in comments below. Your review is not limited to these topics, as described in the reviewer guide.

Documentation

The package includes all the following forms of documentation:

Readme file requirements The package meets the readme requirements below:

The README should include, from top to bottom:

Usability

Reviewers are encouraged to submit suggestions (or pull requests) that will improve the usability of the package as a whole. Package structure should follow general community best-practices. In general please consider whether:

Functionality

For packages also submitting to JOSS

Note: Be sure to check this carefully, as JOSS's submission requirements and scope differ from pyOpenSci's in terms of what types of packages are accepted.

The package contains a paper.md matching JOSS's requirements with:

Final approval (post-review)

Estimated hours spent reviewing: 6

---

Review Comments

As a frequent user of Stingray, I have casually followed the development of Stingray over the years. It has evolved from its previous iteration and from past projects (Stingray, HENDRICS, etc.) into a full software suite of spectral timing tools applicable to event data. I have always found the documentation to be extensive and comprehensive, with pedagogical Jupyter notebook-style tutorials to get started. My comments are thus fairly minor. I also note the absence of several items from the checklist above, though please let me know if I missed something.

On the JOSS publication: I believe the addition of new features, the integration of additional data formats, and the increased performance of the codebase merit a new JOSS publication. I just have very minor comments below.

I have glanced through all of the notebooks, and I encountered an issue in the “Bexvar tutorial” notebook. This might be my setup, but I get the following error:

```python
ImportError: dlopen(/opt/homebrew/anaconda3/lib/python3.10/site-packages/ultranest/mlfriends.cpython-310-darwin.so, 0x0002): tried: '/Users/masonng/Documents/MIT/Research/software/heasoft-6.33/aarch64-apple-darwin22.1.0/mlfriends.cpython-310-darwin.so' (no such file), '/Users/masonng/Documents/MIT/Research/software/heasoft-6.33/aarch64-apple-darwin22.1.0/lib/mlfriends.cpython-310-darwin.so' (no such file), '/opt/homebrew/anaconda3/lib/python3.10/site-packages/ultranest/mlfriends.cpython-310-darwin.so' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/opt/homebrew/anaconda3/lib/python3.10/site-packages/ultranest/mlfriends.cpython-310-darwin.so' (no such file), '/opt/homebrew/anaconda3/lib/python3.10/site-packages/ultranest/mlfriends.cpython-310-darwin.so' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))
```

I wonder if this is due to an incompatibility with the arm64 architecture. I did install UltraNest with `arch -arm64`, but the file `mlfriends.cpython-310-darwin.so` is still built for x86_64. Apparently UltraNest does not explicitly support the arm64 architecture (PyPI project here)?

I appreciate the “dependencies” sub-section under “installation instructions”; I would just like “UltraNest” added to the list. Have tests also been run on Apple Silicon devices? I encountered the above UltraNest error on my MacBook Pro M1, for example, though this could be a problem with my setup.
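For reference, one quick way to check whether the interpreter itself runs natively on Apple Silicon or under Rosetta (a common source of this kind of mixed-architecture import error) is the standard-library `platform` module:

```python
import platform

# A native interpreter on Apple Silicon reports 'arm64';
# an x86_64 build running under Rosetta reports 'x86_64'.
print(platform.machine())
```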

```
>>> stingray.test()
=========================================================================================== test session starts ============================================================================================
platform darwin -- Python 3.10.12, pytest-7.1.2, pluggy-1.5.0

Running tests with stingray version 2.1.
Running tests in /opt/homebrew/anaconda3/lib/python3.10/site-packages/stingray.

Date: 2024-09-05T15:05:00

Platform: macOS-13.0.1-arm64-arm-64bit

Executable: /opt/homebrew/anaconda3/bin/python

Full Python Version:
3.10.12 | packaged by conda-forge | (main, Jun 23 2023, 22:41:52) [Clang 15.0.7 ]

encodings: sys: utf-8, locale: UTF-8, filesystem: utf-8
byteorder: little
float info: dig: 15, mant_dig: 15

Package versions:
Numpy: 1.23.5
Scipy: 1.10.0
Matplotlib: 3.7.0
h5py: 3.7.0
scikit-image: 0.19.3

Using Astropy options: remote_data: none.

rootdir: /opt/homebrew/anaconda3/lib/python3.10/site-packages/stingray
plugins: anyio-3.5.0, remotedata-0.4.1, doctestplus-1.2.1, astropy-header-0.2.2, asdf-3.4.0
collected 1650 items / 1 skipped

../deadtime/tests/test_fad.py ................                                                                                                                                                       [  0%]
../deadtime/tests/test_models.py ...............                                                                                                                                                     [  1%]
../mission_support/tests/test_mission_support.py ..                                                                                                                                                  [  2%]
../modeling/tests/test_parameterestimation.py ............s.........s...............................                                                                                                 [  5%]
../modeling/tests/test_posterior.py .....................................................                                                                                                            [  8%]
../modeling/tests/test_scripts.py .......                                                                                                                                                            [  8%]
../pulse/overlapandsave/test_ols.py ss...                                                                                                                                                            [  9%]
../pulse/tests/test_accelsearch.py ......                                                                                                                                                            [  9%]
../pulse/tests/test_modeling.py ........                                                                                                                                                             [ 10%]
../pulse/tests/test_pulse.py .sss...............................                                                                                                                                     [ 12%]
../pulse/tests/test_search.py ..............................                                                                                                                                         [ 14%]
../simulator/tests/test_base.py ..                                                                                                                                                                   [ 14%]
../simulator/tests/test_simulator.py ........................................................                                                                                                        [ 17%]
../simulator/tests/test_transfer.py ................                                                                                                                                                 [ 18%]
test_base.py .............................................................................................................................                                                           [ 26%]
test_bexvar.py sssss..s.s.                                                                                                                                                                           [ 26%]
test_bispectrum.py ............................                                                                                                                                                      [ 28%]
test_covariancespectrum.py ......................                                                                                                                                                    [ 29%]
test_crosscorrelation.py ...........................                                                                                                                                                 [ 31%]
test_crossspectrum.py .............................................................................................................................................................................. [ 41%]
...........................................                                                                                                                                                          [ 44%]
test_events.py ..............................................................                                                                                                                        [ 48%]
test_filters.py ..........                                                                                                                                                                           [ 48%]
test_fourier.py .................................................................................................................................................................................... [ 59%]
........                                                                                                                                                                                             [ 60%]
test_gti.py .........................................................                                                                                                                                [ 63%]
test_io.py ....F..............                                                                                                                                                                       [ 64%]
test_lightcurve.py .....s..............................................................................................ss........................................ss.......                           [ 74%]
test_lombscargle.py ...............................................                                                                                                                                  [ 76%]
test_multitaper.py ......................................                                                                                                                                            [ 79%]
test_power_colors.py .............                                                                                                                                                                   [ 80%]
test_powerspectrum.py .................................................................................................................................................................              [ 89%]
test_sampledata.py ....                                                                                                                                                                              [ 90%]
test_spectroscopy.py xX......                                                                                                                                                                        [ 90%]
test_stats.py .......................................................                                                                                                                                [ 93%]
test_utils.py ...................................................s..........                                                                                                                         [ 97%]
test_varenergyspectrum.py ........................................                                                                                                                                   [100%]

================================================================================================= FAILURES =================================================================================================
______________________________________________________________________________ TestIO.test_event_file_read_and_automatic_sort ______________________________________________________________________________

self = <stingray.tests.test_io.TestIO object at 0x167545360>

    def test_event_file_read_and_automatic_sort(self):
        """Test event file reading."""
        fname = os.path.join(datadir, "monol_testA_calib.evt")
        with pytest.warns(AstropyUserWarning, match="No valid GTI extensions"):
            evdata = load_events_and_gtis(fname)
        fname_unsrt = os.path.join(datadir, "monol_testA_calib_unsrt.evt")
>       with pytest.warns(UserWarning, match="not sorted. Sorting them for you"):
E       Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted. The list of emitted warnings is: [].

test_io.py:77: Failed
============================================================================================= warnings summary =============================================================================================
../../_pytest/config/__init__.py:1129
  /opt/homebrew/anaconda3/lib/python3.10/site-packages/_pytest/config/__init__.py:1129: PytestAssertRewriteWarning: Module already imported so cannot be rewritten: asdf
    self._mark_plugins_for_rewrite(hook)

../deadtime/tests/test_fad.py:22: 1 warning
../deadtime/tests/test_models.py:14: 1 warning
../modeling/tests/test_parameterestimation.py:34: 1 warning
../pulse/overlapandsave/test_ols.py:88: 1 warning
../pulse/tests/test_accelsearch.py:6: 1 warning
../pulse/tests/test_pulse.py:32: 1 warning
../pulse/tests/test_pulse.py:55: 1 warning
../pulse/tests/test_search.py:14: 1 warning
test_bexvar.py:11: 1 warning
test_crossspectrum.py:331: 1 warning
test_crossspectrum.py:909: 1 warning
test_crossspectrum.py:921: 1 warning
test_crossspectrum.py:938: 1 warning
test_crossspectrum.py:958: 1 warning
test_lightcurve.py:1605: 1 warning
test_multitaper.py:13: 1 warning
test_powerspectrum.py:210: 1 warning
test_spectroscopy.py:93: 1 warning
test_varenergyspectrum.py:171: 1 warning
test_varenergyspectrum.py:358: 1 warning
  PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo?  You can register custom marks to avoid this warning - for details, see <https://docs.pytest.org/en/stable/how-to/mark.html>

../../pint/pulsar_mjd.py:57
  RuntimeWarning: This platform does not support extended precision floating-point, and PINT will run at reduced precision.

deadtime/tests/test_fad.py: 2 warnings
tests/test_base.py: 3 warnings
tests/test_crossspectrum.py: 1 warning
tests/test_events.py: 2 warnings
tests/test_lightcurve.py: 1 warning
tests/test_powerspectrum.py: 1 warning
tests/test_varenergyspectrum.py: 1 warning
  UserWarning: table path was not set via the path= argument; using default path __astropy_table__

deadtime/tests/test_models.py::test_checkA[True]
deadtime/tests/test_models.py::test_checkA[False]
  UserWarning: All values for SymLogScale are below linthresh, making it effectively linear. You likely should lower the value of linthresh.

modeling/tests/test_parameterestimation.py::TestParameterEstimation::test_sampler_runs
  DeprecationWarning: The py23 module has been deprecated and will be removed in a future release. Please update your code.

modeling/tests/test_parameterestimation.py::TestPSDParEst::test_fit_method_works_with_correct_parameter
  RuntimeWarning: More than 20 figures have been opened. Figures created through the pyplot interface (`matplotlib.pyplot.figure`) are retained until explicitly closed and may consume too much memory. (To control this warning, see the rcParam `figure.max_open_warning`). Consider using `matplotlib.pyplot.close()`.

modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_crossspectrum
modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_powerspectrum
pulse/tests/test_accelsearch.py::TestAccelsearch::test_signal
pulse/tests/test_accelsearch.py::TestAccelsearch::test_signal_neg_fdot
pulse/tests/test_accelsearch.py::TestAccelsearch::test_noisy
pulse/tests/test_accelsearch.py::TestAccelsearch::test_noisy_neg_fdot
  RuntimeWarning: divide by zero encountered in log

modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_crossspectrum
modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_powerspectrum
  RuntimeWarning: divide by zero encountered in divide

modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_crossspectrum
modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_powerspectrum
  RuntimeWarning: invalid value encountered in subtract

pulse/tests/test_modeling.py::test_gaussian_bounds
  RuntimeWarning: underflow encountered in exp

pulse/tests/test_modeling.py::test_gaussian_bounds
  RuntimeWarning: underflow encountered in multiply

pulse/tests/test_search.py: 1 warning
simulator/tests/test_simulator.py: 39 warnings
tests/test_crosscorrelation.py: 1 warning
tests/test_crossspectrum.py: 2 warnings
tests/test_lightcurve.py: 1 warning
tests/test_lombscargle.py: 3 warnings
tests/test_multitaper.py: 1 warning
tests/test_varenergyspectrum.py: 3 warnings
  UserWarning: SIMON says: Stingray only uses poisson err_dist at the moment. All analysis in the light curve will assume Poisson errors. Sorry for the inconvenience.

simulator/tests/test_simulator.py: 2 warnings
tests/test_crosscorrelation.py: 1 warning
tests/test_crossspectrum.py: 75 warnings
tests/test_spectroscopy.py: 32 warnings
tests/test_varenergyspectrum.py: 1 warning
  UserWarning: n_ave is below 30. Please note that the error bars on the quantities derived from the cross spectrum are only reliable for a large number of averaged powers.

simulator/tests/test_simulator.py: 2 warnings
tests/test_crosscorrelation.py: 1 warning
tests/test_crossspectrum.py: 97 warnings
tests/test_lombscargle.py: 10 warnings
tests/test_spectroscopy.py: 46 warnings
tests/test_varenergyspectrum.py: 2 warnings
  RuntimeWarning: invalid value encountered in sqrt

tests/test_bexvar.py::TestInternalFunctions::test_lscg_gen
  UnitsWarning: 'ratio' did not parse as fits unit: At col 0, Unit 'ratio' not supported by the FITS standard.  If this is meant to be a custom unit, define it with 'u.def_unit'. To have it recognized inside a file reader or other code, enable it with 'u.add_enabled_units'. For details, see <https://docs.astropy.org/en/latest/units/combining_and_defining.html>

tests/test_crosscorrelation.py::TestCrossCorrelation::test_simple_plot
tests/test_crosscorrelation.py::TestCrossCorrelation::test_plot_axis
tests/test_crosscorrelation.py::TestCrossCorrelation::test_plot_title
  UserWarning: Matplotlib is currently using agg, which is a non-GUI backend, so cannot show the figure.

tests/test_crossspectrum.py: 12 warnings
  UserWarning: Some error bars in the Averaged Crossspectrum are invalid.Defaulting to sqrt(2 / M) in Leahy norm, rescaled to the appropriate norm.

tests/test_crossspectrum.py::TestAveragedCrossspectrumEvents::test_internal_from_events_works_acs
tests/test_crossspectrum.py::TestAveragedCrossspectrumEvents::test_from_events_works_acs
tests/test_crossspectrum.py::TestAveragedCrossspectrumEvents::test_rebin
tests/test_crossspectrum.py::TestAveragedCrossspectrumEvents::test_rebin_factor
tests/test_crossspectrum.py::TestAveragedCrossspectrumEvents::test_rebin_log
tests/test_crossspectrum.py::TestAveragedCrossspectrum::test_rebin
tests/test_crossspectrum.py::TestAveragedCrossspectrum::test_rebin_factor
tests/test_crossspectrum.py::TestAveragedCrossspectrum::test_rebin_log
  UserWarning: SIMON says: Number of segments used in averaging is significantly low. The result might not follow the expected statistical distributions.

tests/test_crossspectrum.py: 9 warnings
tests/test_powerspectrum.py: 4 warnings
  RuntimeWarning: invalid value encountered in divide

tests/test_multitaper.py::TestMultitaper::test_make_multitaper_from_lightcurve[lc_gauss]
tests/test_multitaper.py::TestMultitaper::test_multitaper_lombscargle
  UserWarning: SIMON says: Looks like your lightcurve statistic is not poisson.The errors in the Powerspectrum will be incorrect.

tests/test_powerspectrum.py::TestPowerspectrum::test_classical_significances_trial_correction
tests/test_powerspectrum.py::TestPowerspectrum::test_pvals_is_numpy_array
  RuntimeWarning: underflow encountered in _logp_multitrial_from_single_logp

-- Docs: <https://docs.pytest.org/en/stable/how-to/capture-warnings.html>
========================================================================================= short test summary info ==========================================================================================
FAILED test_io.py::TestIO::test_event_file_read_and_automatic_sort - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted. The list of emitted warnings is: [].
======================================================== 1 failed, 1627 passed, 21 skipped, 1 xfailed, 1 xpassed, 410 warnings in 143.24s (0:02:23) ========================================================
<ExitCode.TESTS_FAILED: 1>

```

taldcroft commented 4 weeks ago

@hamogu - sorry, this fell off my radar. Will do early next week.

matteobachetti commented 3 weeks ago

Thanks for the ongoing reviews! FYI, I've started addressing comments on readme/docs at https://github.com/StingraySoftware/stingray/pull/842

taldcroft commented 3 weeks ago

Package Review

Please check off boxes as applicable, and elaborate in comments below. Your review is not limited to these topics, as described in the reviewer guide

Documentation

The package includes all the following forms of documentation:

Readme file requirements The package meets the readme requirements below:

The README should include, from top to bottom:

NOTE: If the README has many more badges, you might want to consider using a table for badges: see this example. Such a table should be wider than it is tall. (Note that a badge for pyOpenSci peer review will be provided upon acceptance.)

Usability

Reviewers are encouraged to submit suggestions (or pull requests) that will improve the usability of the package as a whole. Package structure should follow general community best-practices. In general please consider whether:

Functionality

For packages also submitting to JOSS

Note: Be sure to check this carefully, as JOSS's submission requirements and scope differ from pyOpenSci's in terms of what types of packages are accepted.

The package contains a paper.md matching JOSS's requirements with:

Final approval (post-review)

Estimated hours spent reviewing: 5


Review Comments

I am reviewing this package as an (inactive) astronomer and active Astropy developer.

From a scientific perspective, it appears that the stingray package fills a distinct need for modern, open-source, Python-based X-ray spectral-timing tools. It is widely used in the X-ray timing community, with 170 citations to the 2019 ApJ paper, and in recent years it has been receiving around 35 citations per year.

From a software perspective this checks all the boxes (literally) for a package that meets current standards for a well-developed, documented, and maintained package. The test coverage is excellent.

I ran all the notebooks and found a couple of problems (documented as issues). This highlights the need for https://github.com/StingraySoftware/stingray/issues/802 to be addressed, but overall things look OK, and those notebook issues are not a blocker for accepting the package.

I opened one issue about documenting how to download the tutorial notebooks that form most of the package's functional documentation. This is a simple docs update, and I would like to see it done before the package is accepted.

matteobachetti commented 3 weeks ago

Update:

matteobachetti commented 3 weeks ago

@masonng-astro I have a similar setup to yours (MacBook Pro with M1 processor), but I couldn't reproduce your bugs 🆘

The failing test on unsorted data is particularly weird, as it does not seem to rely on any specific dependency beyond the default ones. There is a vague chance that it is related to the numba installation, as I use numba to check whether the data are sorted, but it should fall back to standard numpy if numba is not installed or fails.
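For illustration, the pattern described above (a numba-compiled sortedness check with a plain-numpy fallback) could look roughly like this hypothetical sketch; it is not necessarily Stingray's actual implementation:

```python
import numpy as np

try:
    from numba import njit

    @njit
    def _is_sorted(arr):
        # Single pass; bail out at the first out-of-order pair.
        for i in range(arr.size - 1):
            if arr[i] > arr[i + 1]:
                return False
        return True

except ImportError:

    def _is_sorted(arr):
        # Pure-numpy fallback when numba is unavailable.
        return bool(np.all(arr[:-1] <= arr[1:]))
```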

I could also install UltraNest without issues. I wonder if you have conflicting versions of some dependencies? From the list of libraries in your error, there seems to be some kind of clash between HEASOFT-installed libraries and the standard Python path. Could that be the case?

taldcroft commented 3 weeks ago

FWIW, as noted in my review, tests passed for me on Apple Silicon with Python 3.12. I created a fresh conda environment (using the conda defaults channel) for Python 3.12 and then pip-installed everything else.

taldcroft commented 2 weeks ago

With https://github.com/StingraySoftware/stingray/pull/842 now merged, I have updated my review to recommend approving this package!

hamogu commented 1 week ago

@masonng-astro Could you check whether your concerns have been resolved, or provide a more detailed look at why it fails? It seems that @matteobachetti tried to reproduce your failure but was unable to. Of course, we don't want to accept a failing package, but it's also hard to figure out what went wrong. 😕

Once you think it's ready to be accepted, please edit the post with your review, tick the box for "I recommend approving this package.", and leave a short message here (GH will inform me when you post a new comment, but not when you edit an existing one).

masonng-astro commented 5 days ago

> @masonng-astro I have a similar setup to yours (MacBook Pro with M1 processor), but I couldn't reproduce your bugs 🆘
>
> The failing test on unsorted data is particularly weird, as it does not seem to rely on any specific dependency beyond the default ones. There is a vague chance that it is related to the numba installation, as I use numba to check whether the data are sorted, but it should fall back to standard numpy if numba is not installed or fails.
>
> I could also install UltraNest without issues. I wonder if you have conflicting versions of some dependencies? From the list of libraries in your error, there seems to be some kind of clash between HEASOFT-installed libraries and the standard Python path. Could that be the case?

Sorry for my late response. I did eventually manage to hunt down the issue, and it boiled down to a weird, esoteric mix of architectures in my Anaconda installation that I had never encountered until now! Apologies for holding up the review. I'll go ahead and approve now. I did check everything else, and all is good on my end!

hamogu commented 4 days ago

🎉 Stingray has been approved by pyOpenSci! Thank you @matteobachetti for submitting stingray and many thanks to @taldcroft and @masonng-astro for reviewing this package! 😸

Author Wrap Up Tasks

There are a few things left to do to wrap up this submission:

It looks like you would like to submit this package to JOSS. Here are the next steps:

🎉 Congratulations! You are now published with both JOSS and pyOpenSci! 🎉

Editor Final Checks

Please complete the final steps to wrap up this review. Editor, please do the following:


If you have any feedback for us about the review process please feel free to share it here. We are always looking to improve our process and documentation in the peer-review-guide.

hamogu commented 4 days ago

@matteobachetti We at pyOpenSci would love to highlight stingray in a blog post similar to what these packages have done: pandera, movingpandas (we have an md template that you could use as a guide when creating your post), but I know that you've written a lot about Stingray before, and this is certainly not a requirement!

I also want to invite all three of you to the pyOpenSci Slack: there is a very active community that discusses best practices in packaging, testing, and administration of Python packages. Personally, I found a lot of those discussions very insightful and useful, though I myself have never contributed by posting; I just use it as a resource to look at. If you'd like to join, let me know which email address I should invite!

matteobachetti commented 4 days ago

@hamogu I opened a PR with the badge, and filled in the form. Later on I will open the JOSS issue. Thanks a lot!

matteobachetti commented 4 days ago

> @matteobachetti We at pyOpenSci would love to highlight stingray in a blog post similar to what these packages have done: pandera, movingpandas (we have an md template that you could use as a guide when creating your post), but I know that you've written a lot about Stingray before, and this is certainly not a requirement!

I think it's a good idea, I have some matters to deal with in the next few weeks but I'll try to do it. Thanks!

> I also want to invite all three of you to the pyOpenSci Slack: there is a very active community that discusses best practices in packaging, testing, and administration of Python packages. Personally, I found a lot of those discussions very insightful and useful, though I myself have never contributed by posting; I just use it as a resource to look at. If you'd like to join, let me know which email address I should invite!

Sent in private message!