MillionConcepts / lhorizon

lhorizon helps you find things in the Solar System.
BSD 3-Clause "New" or "Revised" License

do tests verify results against external sources (e.g., a Horizons dump)? #24

Closed steo85it closed 3 years ago

steo85it commented 3 years ago

As the title says: do the (very extensive and well-functioning) tests provided with the code only check internal functionality and consistency, or do they also verify that the package's results are consistent with output from the Horizons platform (or from spiceypy, astropy, or anything else)?

If so, I would suggest:

If not, then:

https://github.com/openjournals/joss-reviews/issues/3495

m-stclair commented 3 years ago

Many of the tests check against saved CSV dumps of Horizons output. However, all of the package's inputs are, of course, also essentially live dumps of similar Horizons output, so this is less reassuring than it could be. To guard against the possibility that either the tests or the package as a whole parse Horizons output in a pervasively but subtly incorrect way that introduces a systematic numerical error, I also have a few black-box "reasonableness" checks, e.g.:
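As a rough illustration of what such a black-box "reasonableness" check might look like (the bounds, function name, and sample data below are illustrative, not lhorizon's actual test code): a check that queried lunar distances fall within the physically plausible perigee-to-apogee range, regardless of how the underlying output was parsed.

```python
# Illustrative bounds: the geocentric lunar distance varies between
# roughly 356,000 km (extreme perigee) and 407,000 km (extreme apogee).
LUNAR_DIST_MIN_KM = 356_000
LUNAR_DIST_MAX_KM = 407_000


def lunar_distance_is_reasonable(distances_km) -> bool:
    """Black-box sanity check: every queried geocentric lunar distance
    falls inside the physically plausible range. A pervasive parsing
    error (wrong units, shifted columns) would push values far outside
    these bounds and fail this check."""
    return all(
        LUNAR_DIST_MIN_KM <= d <= LUNAR_DIST_MAX_KM for d in distances_km
    )


# Synthetic stand-in for distances an ephemeris query might return:
sample = [363_300.0, 384_400.0, 405_500.0]
assert lunar_distance_is_reasonable(sample)
# A unit mix-up (e.g. AU parsed as km) would be caught immediately:
assert not lunar_distance_is_reasonable([0.00257])
```

The value of a check like this is that it does not depend on Horizons output at all, so it cannot share a parsing bug with the code under test.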

Do you think I should have more of these? I have certainly considered adding calls to other data sources. Other purely analytical tests could also be added, such as checking whether the position lhorizon reports for a particular planet falls within the expected margin of error of its approximate position as computed from Keplerian formulae.

In either case, I can add more documentation to lhorizon.tests.

m-stclair commented 3 years ago

added some explanatory docstrings and comments to lhorizon.tests submodules: 30a9140

leaving this issue open to get your opinion on the broader verification question.

steo85it commented 3 years ago

Thanks for the detailed explanation: I think the tests are more than adequate and that should now be clear from the docstrings.