goerz opened 3 years ago
I'm not sure what the project priorities are, but from a user perspective I totally agree. Having pre-built wheels available would be wonderful, as helping less-advanced programmers get started with this package can be a challenge.
I would be willing to help put some work towards this as well if that would be a useful place to contribute.
This is something we're aware of, but unfortunately none of the maintainers have time to work on this. If anyone is willing to do this and submit a PR (maybe @dhill2522?) then that would be greatly appreciated.
I don't really know too much about how Python wheels work, or conda packages for that matter. More specifically, I'm not sure how binaries would work if the user wants to also install optional optimizers, which must be installed separately and linked via dynamic libraries.
Ok. We have had some experience with PyPI and binaries with the GEKKO project, and I think we should be able to figure something out. I will probably be relatively busy for the next week, but after that I should have a good chance to take a look at it. Certainly no guarantees I will have the time, but we will see.
As a note to myself and anyone working on this: if we were to release this as a conda package rather than a PyPI package, then we might be able to take advantage of solver installations already published as conda packages. For example, IPOPT is available relatively easily on Mac and Linux through cyipopt (https://github.com/matthias-k/cyipopt; some tweaking likely required). I realize we would probably expect people to install the solvers themselves, but it seems that pyoptsparse has some pretty inherently system-level dependencies rather than just Python ones. In my experience, conda has generally worked better for that.

Are there any strong feelings on a conda package vs. a PyPI one?
One challenge is that pyoptsparse is often run on clusters, where the Anaconda build is not always available. I don't object to a conda package though, as long as the manual install remains an option.
Good point. I'm guessing that people really looking to maximize performance would opt for a manual install anyway as it would allow for configuration of linear solver packages and things like that.
I certainly agree that a manual install should still be possible no matter how an installation package is made.
Being able to interface to cyipopt or others would be great, but will likely require some substantial changes to not just the installation process but the Python wrapper as well. This will probably be a broader issue tackled at a later stage.
I don't know too much about Python packaging (with pip/conda), but if it's possible to provide, say, just the base package without any additional optimizers using Python wheels (i.e. pip), then that is probably a great starting point, since, like @JustinSGray said, not everyone uses conda but pretty much everyone uses pip.
This is a broad issue and I would prefer tackling it incrementally, rather than aiming for a larger goal (such as some sort of conda package with configurable linking with other optimizer libraries).
So I took a look at it and it looks pretty simple so far. I think `setup.py` just needed to be converted from whatever it was (easy_install?) to the pip format. I used this tutorial for the actual release: https://betterscientificsoftware.github.io/python-for-hpc/tutorials/python-pypi-packaging/

I did this on the `pip` branch of my fork. It seemed to work fine, and I was able to release to the test PyPI instance without an issue using the package name "pyoptsparse-dhill2522". You can take a look at the branch here and the test release here. The built package can also be installed with `pip install -i https://test.pypi.org/simple/ pyoptsparse-dhill2522`. I believe it will auto-link with any installed optimizers on install just like it did before, but I have not tested it yet.
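For reference, the release flow from that tutorial boils down to roughly the following (a sketch; it assumes `twine` is installed and a TestPyPI account is configured):

```
# Build the source distribution and wheel, then upload to TestPyPI
python setup.py sdist bdist_wheel
twine upload --repository-url https://test.pypi.org/legacy/ dist/*
```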
There are a couple of questions I had though:

- There are some changes in `setup.py` that I don't fully understand. Can anyone with more experience on this project take a look and see if it looks right?
- Should the tests be run against the built package, or against the `test` directory directly? I have not run any tests on the built library yet.
- Finally, I added some required metadata to `setup.py` (see the sketch below) and used my own info there, in order to avoid impersonating anyone and because I don't know exactly what should go there. These fields will obviously need to be updated before actually doing any release.
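For context, the metadata fields are the standard setuptools ones. A minimal sketch of what that section of `setup.py` looks like (every value below is a placeholder, not the real project info):

```python
# Standard setuptools metadata; all values here are placeholders that would
# need to be filled in with the real project information before a release.
from setuptools import setup, find_packages

setup(
    name="pyoptsparse-dhill2522",    # test-release name, not the final one
    version="0.0.1",                 # placeholder version
    author="Your Name",              # placeholder
    author_email="you@example.com",  # placeholder
    description="Python package for formulating and solving optimization problems",
    url="https://github.com/mdolab/pyoptsparse",
    packages=find_packages(),
)
```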
Hi Daniel, thanks for putting in some work on this. However, I tried it out, and the package available on PyPI (the test one) does not seem to contain any compiled libraries, so none of the optimizers work. The way the current `setup.py` works is that it uses `numpy.distutils` instead of `setuptools`, so that during installation it calls the system Fortran compiler to compile the Fortran source code for the optimizers, then calls `f2py` to generate the Python bindings. So these libraries must be pre-compiled and provided as part of the distribution on PyPI. I am not entirely sure how to do this and also guarantee that they will work on all systems that would install pyOptSparse.

The additional complication comes in through the optional optimizers, as I mentioned before. These are detected by `setup.py` at install time and linked appropriately. In theory, if we switched to a dynamic linking process, it could be possible to just link to these additional optimizers instead of compiling everything locally, but again I am not sure. As you can see, this is somewhat involved, which is why nobody has tackled it before. But I think it is certainly possible to provide wheels containing all the optimizers that are shipped within pyOptSparse; if you need the optional linking with SNOPT etc., then you would have to install from source as you do now. That is probably a good first step.
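To make the current build flow concrete, here is a rough sketch of how a `numpy.distutils`-based `setup.py` compiles a Fortran extension via `f2py` (the module and file names are made up for illustration, not pyOptSparse's actual layout):

```python
# Illustrative numpy.distutils setup for a Fortran extension. The names below
# are invented for the example and do not match pyOptSparse's source tree.
from numpy.distutils.core import setup
from numpy.distutils.misc_util import Configuration

def configuration(parent_package="", top_path=None):
    config = Configuration("example_opt", parent_package, top_path)
    # Listing .pyf/.f sources makes numpy.distutils call the system Fortran
    # compiler and f2py, producing a compiled extension importable from Python.
    config.add_extension("slsqp", sources=["source/slsqp.pyf", "source/slsqp.f"])
    return config

if __name__ == "__main__":
    setup(configuration=configuration)
```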
Ok, looks like I completely missed the majority of the work then, but thanks for the direction. I have not had any experience with packages that require compilation of other languages, so it will likely take me a little while to figure it out.
I will see if I can make progress on this, but it is not pressing for my work, so it may go slowly.
No worries, this would be a major overhaul of the setup process for pyOptSparse so it will for sure take some time. Since this is more of a "nice to have" feature, there is no rush to get this working.
That being said, this is probably not as difficult as I made it out to seem. In fact, it's possible that everything is already there in `setup.py` and a simple `python setup.py bdist_wheel` is sufficient to generate the wheel, as long as we only want the default optimizers to be made available. I may give this a shot some time later this week (and feel free to tinker with this also). We will have to figure out how to make this flexible enough to dynamically link to the other libraries, though I am sure a solution exists out there.
I actually tried simply running `python setup.py bdist_wheel` in a fresh environment first. It failed because `numpy.distutils` is imported before `setup()` runs, and numpy was not installed in the fresh environment (nor would it necessarily be available on an end user's machine).

On second thought, there is probably some sort of hook in `setup()` that allows running a script after the dependencies are worked out. If so, this could actually be fairly simple.
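One common workaround along those lines (a general setuptools pattern, not something pyOptSparse currently does): declare numpy in `setup_requires` and defer the import into a `build_ext` subclass, so it only happens after setuptools has made numpy available:

```python
# Sketch of the deferred-numpy-import pattern; "example-pkg" is a placeholder.
from setuptools import setup
from setuptools.command.build_ext import build_ext as _build_ext

class BuildExt(_build_ext):
    def finalize_options(self):
        _build_ext.finalize_options(self)
        # Safe to import here: setup_requires has been processed by now.
        import numpy
        self.include_dirs.append(numpy.get_include())

setup(
    name="example-pkg",
    setup_requires=["numpy"],
    cmdclass={"build_ext": BuildExt},
)
```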
That's not really a problem for building the wheel since it won't be done by the user. It'll be run by the CI provider (currently Travis) on new releases in order to publish the package on PyPI, so as long as the environment there has the required dependencies to build the wheel we should be okay.
For those that install from source, there are ways around this also, using for example `pyproject.toml` according to PEP 518 (see this for a quick overview). Python packaging is a bit of a mess at the moment I think, but we can do a better job there for sure. For example, we used to have a line here saying you need NumPy as a build dependency, but somehow that was removed.
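A minimal PEP 518 `pyproject.toml` along these lines would look like this (a sketch of the standard mechanism, not the project's actual file):

```toml
# PEP 518 build requirements: pip installs these into an isolated build
# environment before running setup.py, so numpy.distutils can be imported.
[build-system]
requires = ["setuptools", "wheel", "numpy"]
```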
It seems that this package is a prerequisite for testing dymos (https://github.com/OpenMDAO/dymos/issues/471). Thus, it would be great if `pyoptsparse` was available on PyPI as well.

The installation instructions for `pyoptsparse` (`pip install .`) did not work out of the box for me, apparently due to compilation problems of the Fortran code: error.log
For packages with compiled code, I would strongly recommend uploading precompiled wheels to PyPI to make installation easy on systems that don't have the necessary compilers installed (Windows especially). I've had good experiences with using https://github.com/joerick/cibuildwheel running on GitHub Actions to generate the wheels.
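As a sketch, a minimal GitHub Actions workflow using cibuildwheel might look like the following (action and tool versions are illustrative, and a real pyOptSparse build would also need a Fortran toolchain installed in each build image):

```yaml
# Illustrative cibuildwheel workflow; versions are examples only, and a real
# pyOptSparse setup would additionally require Fortran compilers in the images.
name: Build wheels
on: [push]
jobs:
  build_wheels:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
      - name: Build wheels
        run: |
          pip install cibuildwheel
          cibuildwheel --output-dir wheelhouse
      - uses: actions/upload-artifact@v2
        with:
          path: wheelhouse/*.whl
```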
Releasing a conda package on conda-forge may also be something to look into, but personally I think wheels have a higher priority.