simpeg / aurora

software for processing natural source electromagnetic data
MIT License

test_doc_build.py suddenly failing on python 3.9+ but not on 3.8 #283

Closed: kkappler closed this issue 10 months ago

kkappler commented 1 year ago

This is happening on the current dev branch fourier_coefficients. First observed on Wednesday, Aug 29, 2023.

Python 3.9 tests fail with the following summary, but 3.8 is passing.

=========================== short test summary info ============================
FAILED tests/test_doc_build.py::Doc_Test::test_html - AssertionError: False is not true
=========== 1 failed, 49 passed, 26206 warnings in 672.77s (0:11:12) ===========

@laura-iris can you take a look at this?

laura-iris commented 1 year ago

I'll spend some time looking into this and let you know what I find.

laura-iris commented 12 months ago

The lead I've been going down is related to the Sphinx version. Sphinx has had a lot of updates in the past month, including a jump from v7.1.2 to v7.2.x, and with that jump it dropped support for Python 3.8. This means that the tests we run use different versions of Sphinx: the 3.8 test uses v7.1.2, while the rest use v7.2.5.

Locally, when I run the tests, my environment uses v7.2.5 and the doc build test fails with what appear to be the same errors that we see in the GitHub Run Tests output.

The fix? Not sure yet.
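
For a quick sanity check, here is a minimal sketch (assuming it is run inside each test environment; nothing aurora-specific) that prints which Sphinx version a given Python actually resolves, to confirm the 7.1.2 vs 7.2.x split:

# Diagnostic sketch: report the interpreter version and the Sphinx version it
# resolves, to confirm the 3.8 vs 3.9+ Sphinx split described above.
import sys

import sphinx

print(f"Python {sys.version_info.major}.{sys.version_info.minor} -> Sphinx {sphinx.__version__}")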

jcapriot commented 12 months ago

I took a preliminary look through the error messages myself... My guess is that it's related to a failing notebook example: https://github.com/simpeg/aurora/actions/runs/6116597192/job/16602042600#step:6:89

The error says

Notebook error:
NoSuchKernel in examples/placeholder.ipynb:
No such kernel named python3

Not that I know how to resolve that, but it is where I would start looking.
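
If it helps, a minimal diagnostic sketch (assuming jupyter_client is importable in the docs-build environment) that lists the kernelspecs the notebook build can find; the NoSuchKernel error means "python3" is missing from this list:

# Diagnostic sketch: list the kernelspecs visible to the docs-build environment.
# "python3" must appear here for the notebook example to execute.
from jupyter_client.kernelspec import KernelSpecManager

print(sorted(KernelSpecManager().get_all_specs()))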

kkappler commented 12 months ago

Interestingly, there was at one time a notebook called placeholder.ipynb in aurora/docs/examples, but it is no longer present.

With a fresh checkout of the fix_issue_283 branch, I run find . | grep ipynb from the top level and get:

./tutorials/processing_configuration.ipynb
./tutorials/synthetic_data_processing.ipynb
./tutorials/pole_zero_fitting/lemi_pole_zero_fitting_example.ipynb
./docs/examples/make_cas04_single_station_h5.ipynb
./docs/examples/dataset_definition.ipynb
./docs/examples/deprecated/create_mth5_for_parkfield_with_rover.ipynb
./docs/examples/operate_aurora.ipynb

I wonder where the reference to this deleted notebook is coming from.
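
One way to hunt for it, as a rough sketch (nothing aurora-specific), is to walk the checkout for text files that still mention the name:

# Rough sketch: report files in the checkout that still mention "placeholder",
# to locate whatever points at the deleted notebook.
from pathlib import Path

for path in Path(".").rglob("*"):
    if path.is_file() and path.suffix in {".py", ".rst", ".txt", ".ipynb", ".cfg", ".toml"}:
        if "placeholder" in path.read_text(errors="ignore"):
            print(path)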

jcapriot commented 12 months ago

It is likely created during the sphinx-gallery translation of https://github.com/simpeg/aurora/blob/fourier_coefficients/tutorials/placeholder.py
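
For context, a typical sphinx-gallery block in docs/conf.py looks roughly like the hypothetical sketch below (I have not checked aurora's actual settings, so the paths are assumptions): every script under examples_dirs is executed during the doc build and rendered into gallery_dirs, including a generated .ipynb, which is how a placeholder.ipynb can show up without being checked in.

# Hypothetical sketch of a sphinx-gallery configuration in docs/conf.py;
# aurora's real paths may differ. Scripts under examples_dirs are executed at
# build time and rendered into gallery_dirs, with an .ipynb generated per script.
extensions = ["sphinx_gallery.gen_gallery"]

sphinx_gallery_conf = {
    "examples_dirs": ["../tutorials"],  # source scripts, e.g. placeholder.py
    "gallery_dirs": ["tutorials"],      # generated output, incl. .ipynb files
}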

kkappler commented 12 months ago

Thanks @jcapriot and @laura-iris: deleting the stale placeholder file does allow the test to pass.

This is probably OK for now, but it would be good to actually have a placeholder.py that works, so that we could follow it as a template if we want to add a .py file to the tutorials folder.

@laura-iris We can discuss/prioritize this afternoon. It might be better to reduce the ~200 lines of warnings in test_doc_build results before digging into placeholder.py.

I know that there are ipynbs in the repo that reference kernels that are not present during testing (where I believe the environment is called aurora-test). It may be better to focus, for example, on issue #45 first and get all the ipynbs updated to work in that environment before tackling the rest of this issue.

FWIW I paste the placeholder.py contents below:

"""
Placeholder for example
===========================

This example is a placeholder that uses the sphinx-gallery syntax
for creating examples
"""

import aurora
import numpy as np
import matplotlib.pyplot as plt

###############################################################################
# Step 1
# ------
# Some description of what we are doing in step one

x = np.linspace(0, 4 * np.pi, 100)

# take the sin(x)
y = np.sin(x)

###############################################################################
# Step 2
# ------
# Plot it

fig, ax = plt.subplots(1, 1)
ax.plot(x, y)

laura-iris commented 12 months ago

By adding a file called requirements.txt under docs/, I was able to get the tests to pass on all Python versions:

cat docs/requirements.txt
ipykernel

https://github.com/simpeg/aurora/actions/runs/6124926152/job/16625895596
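
If I understand the fix correctly (assuming the docs job pip-installs docs/requirements.txt), installing ipykernel registers the "python3" kernelspec that the notebook build was failing to find, which can be verified with something like:

# Verification sketch: once ipykernel is installed, the "python3" kernelspec
# should resolve instead of raising NoSuchKernel.
from jupyter_client.kernelspec import KernelSpecManager

print(KernelSpecManager().get_kernel_spec("python3").resource_dir)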

laura-iris commented 12 months ago

Or maybe my work just overlapped with yours, Karl.

kkappler commented 12 months ago

Yes, I had deleted the placeholder.py file. I just added it back in, and the tests do not pass. I'll leave this branch alone so you can experiment with the requirements :)

laura-iris commented 12 months ago

Shoot, it looks like it was just overlapping with your work: https://github.com/simpeg/aurora/actions/runs/6125052883/job/16626254946

laura-iris commented 11 months ago

I worked on clearing out a bunch of the auto-documentation warnings so the logs would be quieter. I think there is still work to be done to make sure that the documentation is complete, but the format has been updated so that Sphinx no longer complains.

After I did that, I removed the bandaid and changed nbsphinx_allow_errors back to False in conf.py, and when the tests ran they all still passed. Looking into it, Sphinx had an update around 10 days ago, and the 3.9+ tests use the latest (7.2.6); I suspect this has fixed the error associated with template.py.
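
For reference, the relevant switch is just the nbsphinx flag below (only this line of docs/conf.py shown; the rest is omitted):

# With this set to False, any exception raised while executing a notebook
# fails the doc build instead of being rendered into the HTML output.
nbsphinx_allow_errors = False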

kkappler commented 11 months ago

Thanks @laura-iris, did you push?

Depending on where the changes are, I'd like to merge them into the fourier_coefficients branch, but I already did a merge with main in anticipation of tagging later this week. If you push your updates I'll take a look at merging them.

laura-iris commented 11 months ago

Yeah, all of the changes were pushed to https://github.com/simpeg/aurora/tree/fix_issue_283 as I was iteratively fixing warnings, running tests, fixing warnings, etc.

kkappler commented 10 months ago

Thanks @laura-iris! The bug is gone and, for what it's worth, the output in the GitHub Actions for test_doc_build is now ~50% of its former size. I count 132 readable lines, down from 247 garbled ones; see attached .txt files for the record.

test_doc_build_pre_issue_283.txt test_doc_build_post_issue_283.txt