SAIL-Labs / AMICAL

Extraction pipeline and analysis tools for the Aperture Masking Interferometry mode of the latest generation of instruments (ground-based and space-based).
https://sail-labs.github.io/AMICAL/
MIT License

BUG: fix calling `select_clean_data` with default `isz` #201

Closed: tomasstolker closed this 11 months ago

tomasstolker commented 11 months ago

This PR makes the use of the `isz` parameter more consistent when the argument is set to `None`.

Currently, the value can be set to `None` in `select_clean_data`, in which case the original image size is used without searching for a new image center.

However, `show_clean_params` still searched for a new image center with `isz=None`. I have changed the implementation so that the original image center is used in that case.

I have also added/changed the default value of `isz` in `show_clean_params` and `select_clean_data` to `None`; before, the default was only set to `None` in `clean_data`.
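
For reference, this is roughly how the new default behaves from the user side (a minimal sketch; the file name and the cleaning parameters other than `isz` are just illustrative placeholders, not part of this change):

```python
import amical

# Illustrative cleaning parameters; only isz=None is relevant to this PR
clean_param = {
    "isz": None,  # keep the original image size and center (no re-centering search)
    "r1": 31,
    "dr": 2,
    "apod": True,
    "window": 32,
}

# show_clean_params and select_clean_data now both default to isz=None
amical.show_clean_params("my_data.fits", **clean_param)
cube_cleaned = amical.select_clean_data("my_data.fits", **clean_param)
```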

tomasstolker commented 11 months ago

Sure! I have added a test for `isz=None` in the last commit.
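
For context, the gist of that test is something like the following (a simplified sketch rather than the committed code; the FITS path, HDU index, and cleaning parameters are placeholders):

```python
from astropy.io import fits

import amical


def test_select_clean_data_isz_none():
    # Placeholder path; the real test uses the repository's test.fits
    fits_file = "tests/data/test.fits"

    with fits.open(fits_file) as hdul:
        dim = hdul[0].data.shape[-1]  # assumes the cube lives in the primary HDU

    # With isz=None the cleaned cube should keep the original frame size
    cube = amical.select_clean_data(fits_file, isz=None, r1=31, dr=2, apod=True, window=32)
    assert cube.shape[-1] == dim
```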

The `test_clean_sky_out_crop` test in `test_processing` raises this warning (and the test fails) on my machine:

RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility. Expected 80 from C header, got 96 from PyObject

It might be something specific to my installation (Python 3.11 on macOS), although on GitHub it seems to run fine.

neutrinoceros commented 11 months ago

> The `test_clean_sky_out_crop` test in `test_processing` raises this warning (and the test fails) on my machine:
>
> RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility. Expected 80 from C header, got 96 from PyObject

Warnings are treated as errors so we don't let deprecation warnings slip through AMICAL :)

Now, my guess is that you have a mix of pip-installed and conda-installed packages in there. For instance, if you got numpy from conda and astropy from PyPI, ABI compatibility is not guaranteed and these kinds of warnings can pop up at any point. If you use conda, I'd suggest sticking to it as strictly as possible: some packages like AMICAL may not be installable via conda (though it could be something we support! If you're interested in that, please open a new issue to get the conversation going), but AMICAL is pure Python (no C extensions), so that's not a problem in itself.
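
If it helps with diagnosing, one quick way to see which installer each package came from and which numpy actually gets imported is something like this (just a diagnostic sketch; note that conda does not always write an INSTALLER file, so "unknown" can also mean conda):

```python
import importlib.metadata as md

import numpy

for name in ("numpy", "astropy"):
    dist = md.distribution(name)
    # The INSTALLER file in *.dist-info records "pip" for pip-installed packages
    installer = (dist.read_text("INSTALLER") or "unknown").strip()
    print(f"{name} {dist.version} installed via {installer}")

print("numpy imported from:", numpy.__file__)
```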

tomasstolker commented 11 months ago

> > The `test_clean_sky_out_crop` test in `test_processing` raises this warning (and the test fails) on my machine: RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility. Expected 80 from C header, got 96 from PyObject
>
> Warnings are treated as errors so we don't let deprecation warnings slip through AMICAL :)
>
> Now, my guess is that you have a mix of pip-installed and conda-installed packages in there. For instance, if you got numpy from conda and astropy from PyPI, ABI compatibility is not guaranteed and these kinds of warnings can pop up at any point. If you use conda, I'd suggest sticking to it as strictly as possible: some packages like AMICAL may not be installable via conda (though it could be something we support! If you're interested in that, please open a new issue to get the conversation going), but AMICAL is pure Python (no C extensions), so that's not a problem in itself.

I am only using pip, with a virtualenv for AMICAL, so no conda is involved... I installed with `pip install .` in the AMICAL folder.

neutrinoceros commented 11 months ago

Ah, then there's something fishy going on. These warnings may also occur when importing a C extension that was compiled against a newer version of numpy than the one you have installed. Astropy itself systematically builds against the oldest supported numpy version to prevent this, as do a lot of packages in the scientific Python ecosystem. In fact, I don't know off-hand of any package that doesn't do this properly, so it's hard to know exactly what the problem is without digging into your env! If you can afford it, I'd suggest upgrading numpy with `python -m pip install --upgrade numpy`, which should make the problem go away 🤞🏻

tomasstolker commented 11 months ago

Updating numpy to 1.26.2 did not solve the issue, but creating a new virtualenv and running `pip install .` did work!

tomasstolker commented 11 months ago

I think that might be because the images in test.fits are already centered, so `isz_max` is equal to `dim` and the ValueError does not happen. In a real dataset that will not be the case: `isz_max` will be smaller than `dim` when `isz=None`. In such a case, setting `isz=None` with the implementation on the main branch causes a ValueError, because `crop_max` is executed when it should not be.
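
To make the geometry concrete, here is a toy version of that size check (not AMICAL's actual `crop_max`, just the idea): when the brightest pixel sits exactly at the center, the largest symmetric crop around it equals `dim`; as soon as the peak is off-center, it is necessarily smaller.

```python
import numpy as np


def max_crop_size(img):
    """Largest square crop centered on the brightest pixel that stays inside img."""
    dim = img.shape[0]
    y, x = np.unravel_index(np.argmax(img), img.shape)
    # The distance from the peak to the nearest edge limits the crop half-size
    half = min(x, y, dim - 1 - x, dim - 1 - y)
    return 2 * half + 1


img = np.zeros((79, 79))
img[39, 39] = 1.0            # peak exactly at the center, as in test.fits
print(max_crop_size(img))    # 79 == dim, so no ValueError

img[39, 39] = 0.0
img[39, 49] = 1.0            # off-center peak, as in a real dataset
print(max_crop_size(img))    # 59 < dim, which is where the ValueError comes from
```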

neutrinoceros commented 11 months ago

If you have such a real dataset and it's small enough to be added to the repo (a couple of MB at most), it'd be very useful to use that instead.

tomasstolker commented 11 months ago

A typical dataset is much larger than a few MB. The simplest solution might be to take the images from test.fits, pad them with zeros, and crop around a different center, so that the brightest pixel is offset from the image center (roughly as sketched below)? I will leave that for a future PR.
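
For the record, roughly what I have in mind (a sketch only; the HDU index, the assumed (n_frames, ny, nx) layout, and the shift value are placeholders):

```python
import numpy as np
from astropy.io import fits

with fits.open("test.fits") as hdul:
    cube = hdul[0].data  # assumed layout: (n_frames, ny, nx)

shift = 10  # arbitrary offset for the sketch

# Pad zeros on the top/left of each frame and crop back to the original size,
# which translates the content so the brightest pixel ends up off-center.
padded = np.pad(cube, ((0, 0), (shift, 0), (shift, 0)), mode="constant")
shifted = padded[:, : cube.shape[-2], : cube.shape[-1]]

fits.writeto("test_offcenter.fits", shifted, overwrite=True)
```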

neutrinoceros commented 11 months ago

This sounds like a reasonable approach. Thanks!

DrSoulain commented 11 months ago

Many thanks, @tomasstolker and @neutrinoceros; indeed, it's more consistent now 🙂. I also agree with adding a new test to capture the centering issue that an inappropriate `isz` value might cause.