Closed EiffL closed 2 years ago
Documentation for pytest hooks is here: https://docs.pytest.org/en/7.1.x/how-to/writing_hook_functions.html
This should probably let us decide which tests to run.
Ok, so I got started on an approach to do this in the `autotest` branch. Basically my idea is the following:

- import `jax_galsim` instead of `galsim` in the tests
- tolerate `assert` failures that are caused by `NotImplementedError` or similar, so that all the tests that can run do run, even if there is one line that can't be executed because of a missing feature.

Here is my pytest magic, in a `conftest.py`:
```python
import os

import pytest

# Identify the path to this current file
test_directory = os.path.dirname(os.path.abspath(__file__))

# Load the list of tests to run, dropping empty lines (an empty string
# would be a substring of every nodeid below, so nothing would be skipped)
with open(os.path.join(test_directory, "enabled_tests.txt"), "r") as f:
    enabled_tests = [line.strip() for line in f if line.strip()]


def pytest_collection_modifyitems(config, items):
    skip = pytest.mark.skip(reason="Skipping this because ...")
    for item in items:
        if not any(t in item.nodeid for t in enabled_tests):
            item.add_marker(skip)
```
If my `enabled_tests.txt` only contains, say, `test_gaussian.py`, then the pytest output will look like:
```
ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss [ 18%]
ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss [ 36%]
sssssssssssssss......sssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss [ 55%]
ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss [ 73%]
ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss [ 92%]
sssssssssssssssssssssssssssssssssssssssssssssssssss [100%]
===================================================== short test summary info ======================================================
SKIPPED [5] tests/GalSim/tests/test_airy.py: Skipping this because ...
SKIPPED [8] tests/GalSim/tests/test_bandpass.py: Skipping this because ...
SKIPPED [11] tests/GalSim/tests/test_bessel.py: Skipping this because ...
SKIPPED [4] tests/GalSim/tests/test_box.py: Skipping this because ...
SKIPPED [3] tests/GalSim/tests/test_calc.py: Skipping this because ...
....
6 passed, 665 skipped in 1.03s
```
But I haven't found a way yet to force the tests to use `jax_galsim` instead of `galsim`...
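One possible way to do that (just a sketch of the general technique, not necessarily what was eventually implemented) is to alias the module in `sys.modules` from `conftest.py` before the test files run, so that their `import galsim` statements resolve to `jax_galsim`. Here a stand-in module is built with `types.ModuleType` only to keep the example self-contained; the real `conftest.py` would simply `import jax_galsim`:

```python
import sys
import types

# Hypothetical stand-in for the real jax_galsim package, used here only so
# the example runs on its own; in practice this line would be
# `import jax_galsim`.
jax_galsim = types.ModuleType("jax_galsim")

# Register the alias *before* any test module runs its imports, so that
# `import galsim` inside the tests resolves to jax_galsim instead.
sys.modules["galsim"] = jax_galsim

import galsim  # noqa: E402 -- this is actually jax_galsim now

assert galsim is jax_galsim
```

Since pytest imports `conftest.py` before collecting the test modules, the alias is in place by the time any test file executes its own imports.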
Ok, I think #14 provides a good implementation.
This was solved by #14.
Our target is to be able to automatically run the GalSim test suite on JAX-GalSim without having to copy code or reference data. We will also want a separate set of checks, for instance that objects can be jitted properly and that gradients are computed correctly.
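Such jit/gradient checks could look roughly like the following. The `gaussian_profile` function here is a made-up stand-in, not an actual JAX-GalSim object; only the `jax.jit` / `jax.grad` pattern is the point:

```python
import jax
import jax.numpy as jnp


def gaussian_profile(sigma, r):
    # Radially symmetric Gaussian, standing in for a real JAX-GalSim
    # object (no actual JAX-GalSim API is assumed here).
    return jnp.exp(-0.5 * (r / sigma) ** 2)


# Check that the function can be jitted and gives the same value.
jitted = jax.jit(gaussian_profile)
assert jnp.allclose(jitted(1.0, 2.0), gaussian_profile(1.0, 2.0))

# Check the gradient w.r.t. the parameter against the analytic result:
# d/dsigma exp(-r^2 / (2 sigma^2)) = exp(-r^2 / (2 sigma^2)) * r^2 / sigma^3
grad_fn = jax.grad(gaussian_profile, argnums=0)
g = grad_fn(1.0, 2.0)
expected = jnp.exp(-2.0) * 4.0
assert jnp.allclose(g, expected)
```

In the real test suite each such check would loop over the public objects and their differentiable parameters.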
This issue is to brainstorm possible solutions to this problem; here are a few thoughts to begin with:

- For features that are implemented in JAX-GalSim, the tests should require no modification at all, except adding the `jax_galsim` import in the import section of the test.
- It may be possible to detect test failures due to `NotImplementedError` and allow them to fail; the test suite would then run all the tests that it can.
- It may be possible to maintain a list of tests that are currently expected to fail, and progressively reduce that list over time as more features become available.
Now... how to do these things concretely in practice requires a bit of thinking.
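The last two ideas could be combined in a single `conftest.py` hook: mark every test on an expected-failure list with `xfail(raises=NotImplementedError)`, so those tests still run but their expected failures don't break the suite. This is only a sketch; the file name `expected_failures.txt` and its one-pattern-per-line format are assumptions for illustration:

```python
import os

import pytest

_here = os.path.dirname(os.path.abspath(__file__))
_list_path = os.path.join(_here, "expected_failures.txt")

# One nodeid substring per line; a missing file means nothing is
# expected to fail.
if os.path.exists(_list_path):
    with open(_list_path) as f:
        expected_failures = [line.strip() for line in f if line.strip()]
else:
    expected_failures = []


def pytest_collection_modifyitems(config, items):
    # xfail (unlike skip) still runs the test: it reports "xfailed" while
    # the feature is missing and "xpassed" once it lands, which signals
    # that the entry can be removed from the list.
    xfail = pytest.mark.xfail(raises=NotImplementedError, strict=False)
    for item in items:
        if any(t in item.nodeid for t in expected_failures):
            item.add_marker(xfail)
```

Shrinking `expected_failures.txt` over time then becomes a direct measure of feature coverage.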