leila-pujal closed this 3 months ago
After some tries I converged on a setup that works for Python versions 3.9, 3.10, and 3.11. First I only updated the Python versions on GitHub Actions to be >= 3.9, checking from 3.9 to 3.12 (commit 85ae589). At this point I was not specifying any version for numpy and scipy, and all Python versions failed on Windows with what looked like a problem with scipy's eval_hermite
function (https://github.com/theochem/gbasis/actions/runs/9666120325/job/26664879878). Then I saw in IOData's pytest workflow that for Python 3.9 the oldest supported versions of numpy (1.22) and scipy (1.11) are installed, so I tried specifying them in the workflow. With that, Ubuntu Python 3.12 was giving strange errors, but the others were running fine, so I excluded it. The last change I needed was to upgrade numpy from 1.22 to 1.23, because Python 3.11 on Ubuntu was giving some errors and upgrading numpy locally fixed them. I am writing this to keep track. I am planning to merge this shortly, but feel free to add any comments you find useful or relevant, @FarnazH, @PaulWAyers, @tovrstra.
I quickly checked the error you are getting. It is probably the following:
>>> from scipy.special import eval_hermite
>>> eval_hermite(2.0, 3.5)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: ufunc 'eval_hermite' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
>>> eval_hermite(2, 3.5)
47.0
The first argument (the order) must be of integer type. Reasonable, I guess, but it will require some fixes elsewhere in GBasis, where the array is created or modified, to make sure it is an integer array.
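As a sketch of the kind of fix needed (the array name here is hypothetical, not GBasis's actual code), casting the orders to an integer dtype right before the call avoids the TypeError:

```python
import numpy as np
from scipy.special import eval_hermite

# Hypothetical example: Hermite orders that ended up as floats somewhere upstream.
orders = np.array([0.0, 1.0, 2.0])
point = 3.5

# eval_hermite(2.0, 3.5) raises a TypeError under the 'safe' casting rule,
# so cast the orders to an integer dtype first.
values = eval_hermite(orders.astype(int), point)
print(values)  # H_0(3.5) = 1, H_1(3.5) = 7, H_2(3.5) = 47
```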
Pinning the versions of dependencies can be useful to check that the tests also pass with older versions of those libraries. In IOData, this is only done when testing with the oldest version of Python we test on. (The old versions of dependencies will not always be available for the newest Python version.)
Hi @tovrstra, thank you so much for checking that error. Is that prompt from Python on a Windows machine? I was a bit confused because the error only showed up for the Windows tests and not for Ubuntu. I just checked Python 3.11 and 3.12 locally on my Ubuntu machine, and with scipy==1.14 and numpy==2.0.0 (including the changes done by @msricher in PR #188) all the tests run.
Not really. My most recent experience with Windows dates back to the previous century :)
This was tested on Fedora Linux. Here is a more complete prompt with versions:
Python 3.12.3 (main, Apr 17 2024, 00:00:00) [GCC 14.0.1 20240411 (Red Hat 14.0.1-0)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from scipy.special import eval_hermite
>>> eval_hermite(2.0, 3.5)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: ufunc 'eval_hermite' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
>>> eval_hermite(2,3.5)
47.0
>>> import scipy
>>> scipy.__version__
'1.11.3'
>>> import numpy
>>> numpy.__version__
'1.26.4'
The GitHub Actions jobs failed on Windows first, after which all the other ones were aborted, but that does not mean they would have passed.
Same test with more recent SciPy and NumPy:
Python 3.12.4 | packaged by conda-forge | (main, Jun 17 2024, 10:23:07) [GCC 12.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from scipy.special import eval_hermite
>>> eval_hermite(2.0, 3.5)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: ufunc 'eval_hermite' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
>>> eval_hermite(2,3.5)
np.float64(47.0)
>>> import scipy
>>> scipy.__version__
'1.14.0'
>>> import numpy
>>> numpy.__version__
'2.0.0'
Can you check that your pytest is the intended one? When you run `which pytest`, does it point to the one in your virtual env? If it is `/usr/bin/pytest`, you'll be using the NumPy and SciPy from Ubuntu, which will be a bit older.
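A quick way to confirm which interpreter and packages the tests actually pick up is to print them from within the environment (a sketch, to be run in the same environment as pytest):

```python
# Print which interpreter and which numpy/scipy installations are in use.
import sys
import numpy
import scipy

print(sys.executable)  # should point into the virtual env, not /usr/bin
print("numpy", numpy.__version__, "from", numpy.__file__)
print("scipy", scipy.__version__, "from", scipy.__file__)
```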
Hi @tovrstra, sorry for the late answer and thanks for your follow-up. I had a vague memory that you don't mingle with Windows, but I just wanted to double check :). I checked which pytest I am using and it looks like it takes the one from the environment:
(base) leila@ulysses:~$ conda activate gbasis_py312
(gbasis_py312) leila@ulysses:~$ which pytest
/home/leila/anaconda3/envs/gbasis_py312/bin/pytest
(gbasis_py312) leila@ulysses:~$
I tried your code using the python prompt and in that case I got the expected error:
Python 3.12.4 | packaged by conda-forge | (main, Jun 17 2024, 10:23:07) [GCC 12.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from scipy.special import eval_hermite
>>> eval_hermite(2.0, 3.5)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: ufunc 'eval_hermite' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
>>> eval_hermite(2,3.5)
np.float64(47.0)
>>> import scipy
>>> scipy.__version__
'1.14.0'
>>> import numpy
>>> numpy.__version__
'2.0.0'
>>>
But then my tests pass fine and I am not sure why. I added a print of the numpy and scipy versions in one of the tests and it printed the ones I posted above. By any chance, did you run the tests too? If so, did they fail on your Fedora machine with those versions? I was checking the other Ubuntu pytest runs to see if the tests that use eval_hermite fail (for test_density.py, the ones that use derivatives), and if I checked correctly they passed before the run was aborted (https://github.com/theochem/gbasis/actions/runs/9666120325/job/26664878634). @msricher, could you try to run the gbasis tests with Python 3.12 and those versions of numpy and scipy and see if they fail/pass? It could be that I am actually using my base environment packages and am not able to check it properly.
@leila-pujal No worries about the delay. I took a closer look and my impression is that the cause is already fixed in the main branch, so the error no longer appears.
If you want to bring the fix into the current PR, you can merge the main branch into this development branch, so the fixes are included when the tests rerun. (This kind of reverse merge can be done by running `git merge main` in your local repository after you have switched to the fix_actions branch.) Then push the merged branch to your fork on GitHub and the tests will be executed again. This procedure requires no force-push and allows you to continue working on this PR with the improvements from main.
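The suggested flow, sketched in a throwaway repository (branch names follow the thread; the commit messages are made up for illustration):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b main
git config user.email ci@example.com
git config user.name ci

# Simulate the situation: a PR branch (fix_actions) and a fix that landed on main.
git commit -q --allow-empty -m "initial commit"
git checkout -q -b fix_actions
git commit -q --allow-empty -m "PR work"
git checkout -q main
git commit -q --allow-empty -m "fix on main"

# The reverse merge: switch back to the PR branch and merge main into it.
git checkout -q fix_actions
git merge -q --no-edit main
count=$(git log --oneline | wc -l)
echo "commits on fix_actions after merge: $count"
```

After the merge, a normal `git push` to the fork retriggers the workflow; no force-push is needed.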
@tovrstra, thanks again for your answer and for taking the time to look at this. I merged main into the PR and tried the newest versions of the dependencies again (scipy==1.14 and numpy==2.0.0), but I still got tests failing on Windows while passing on Linux. I searched the scipy issues on GitHub and found this one, opened only a week ago: https://github.com/scipy/scipy/issues/21052. Then I downgraded the numpy version and the tests passed on Windows. What do you think about this? Before finding this issue I was thinking of changing the floats to integers when using eval_hermite, but it looks like maybe it is on scipy's side to fix?
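For what it's worth, the same Hermite values can be cross-checked without scipy's ufunc, using numpy's polynomial module (a sketch for verification only, not a drop-in replacement inside GBasis):

```python
import numpy as np

def hermite_value(order, x):
    """Physicists' Hermite polynomial H_n(x) via numpy.polynomial.hermite."""
    coeffs = np.zeros(order + 1)
    coeffs[order] = 1.0  # select the single basis polynomial H_order
    return np.polynomial.hermite.hermval(x, coeffs)

# H_2(x) = 4 x^2 - 2, so H_2(3.5) = 47, matching eval_hermite(2, 3.5) above.
print(hermite_value(2, 3.5))
```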
Good that you noticed the SciPy issue! I'd suggest adding version constraints in `pyproject.toml`, so it is also handled correctly when gbasis is installed with pip.
See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#dependencies-and-requirements for how this can be done per operating system.
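Per-OS constraints are expressed with environment markers in the dependency strings. A hypothetical sketch (the exact version bounds are assumptions, to be chosen from the failing and passing Windows runs):

```toml
[project]
dependencies = [
    "numpy>=1.23",
    "scipy>=1.11",
    # Hypothetical upper pin for Windows until scipy/scipy#21052 is resolved:
    "numpy<2.0; sys_platform == 'win32'",
]
```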
After updating `pyproject.toml`, you can remove the line `pip install numpy==1.26.0 scipy==1.14.0` from the step `name: Install new dependencies for testing` in pytest.yaml. This way, the tests validate that gbasis works with the newest versions of the dependencies as they appear (and when they are supported). For macOS and Linux, there is no need to set an upper limit, if I understand correctly. (To test with older versions, you still need to manually install them first, so that's OK.)
A few other suggestions:
- Remove `pip install pytest pytest-md pytest-emoji pytest-cov` from the workflow and include these packages in the `[dev]` dependencies, or create `[actions]` dependencies for them. See e.g. https://github.com/theochem/iodata/blob/main/pyproject.toml#L42
- IOData can also be added as a dependency in `pyproject.toml`. Then you can remove the line `pip install --user git+https://github.com/theochem/iodata.git@main` from the pytest workflow. It is OK to just add the `qc-iodata` dependency from PyPI, which will install a fairly recent pre-release.

Thanks @tovrstra for your detailed answer. I think I included all the changes you suggested and now all pytests pass. I will try to figure out why pre-commit stopped working, but if I don't, I'll just merge this PR and check it in more detail later, so new PRs can have proper pytests.
I just added some nitty-gritty comments.
It is probably better to defer the pre-commit issue to another pull request. I quickly tried to run `pre-commit run --all` and it cleans up a lot of stuff. That's more suited for a separate PR.
We haven't activated pre-commit.ci yet for this repo, so it is not checked yet in PRs. If you like, I can already switch it on.
I created an issue to discuss pre-commit. This way, you can proceed with the current PR and merge it sooner.
I just pushed a commit with all your suggestions. I will merge this one today so at least pytest is set up properly. About the pre-commit bot, thanks for the information! It was really strange because it worked before and then suddenly it didn't, so it was difficult for me to understand what happened. I am learning on the go, and while I understand what the pre-commit bot does, setting up these continuous-integration tools in the remote repo is sometimes not as easy as using them locally on the command line. Anyway, we can talk more about this in the issue you opened. Thanks for all your help!
@FarnazH I agree that this is a bit difficult. Once Matt has merged his HORTON2 update, it should not be too hard to upgrade it to more recent Python 3 versions. My reason for not supporting 3.8 in IOData is that Python 3.8 reaches end of life in a few months. See https://devguide.python.org/versions/
This PR addresses the latest updates in the IOData repository, finishing the removal of support for Python versions < 3.9.
Checklist
- Add tests for each unit of code added (e.g. function, class)
- Update documentation
- Squash commits that can be grouped together

Type of Changes

Related