aleaxit / gmpy

General Multi-Precision arithmetic for Python 2.6+/3+ (GMP, MPIR, MPFR, MPC)
https://gmpy2.readthedocs.io/en/latest/
GNU Lesser General Public License v3.0

release? #146

Open videlec opened 7 years ago

videlec commented 7 years ago

Is there any plan for alpha, beta, stable release?

Vincent K. already prepared all the integration in pplpy and SageMath (see ticket 22927 and ticket 22928). We are just waiting for an official release to move on.

casevh commented 7 years ago

I would still like to add mpc support to the C-API. I should be able to work on it this weekend. I should be able to make an alpha release by the end of June.

casevh commented 7 years ago

I'm working on the mpc API and I think I spotted a bug in GMPy_MPFR_New(). Can you add a Cython test that uses a precision of 0? A precision of 0 should cause gmpy2 to use the precision of the currently active context. I suspect it may crash in the current gmpy2 code.

Note that contexts have subtly, but significantly, changed in version 2.1. In gmpy2 2.0.x, there was only a single global context that was shared by all threads. Contexts are now thread specific.
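The two behaviors discussed above (thread-specific contexts, and a requested precision of 0 falling back to the active context) can be sketched in plain Python. This is a conceptual model only, not the actual gmpy2 internals; the helper names are hypothetical:

```python
import threading

# Conceptual model of the 2.1 behavior: each thread sees its own
# context, so changing the precision in one thread does not affect
# another thread's calculations.
_tls = threading.local()

DEFAULT_PRECISION = 53  # MPFR's default (IEEE double precision)

def get_context_precision():
    # Lazily create a per-thread context on first access.
    if not hasattr(_tls, "precision"):
        _tls.precision = DEFAULT_PRECISION
    return _tls.precision

def set_context_precision(prec):
    _tls.precision = prec

def new_mpfr_precision(requested):
    # Mirrors the intended GMPy_MPFR_New() semantics: a requested
    # precision of 0 means "use the active context's precision".
    return get_context_precision() if requested == 0 else requested
```

Under this model, a worker thread that never touched its context still sees the default precision even after the main thread changed its own.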

videlec commented 7 years ago

Indeed, it does crash.

casevh commented 7 years ago

I've added mpc to the C API. GMPy_MPFR_New() and GMPy_MPC_New() shouldn't crash if you specify a precision of 0.

Can you also add a test where mpz raises an exception? I think MPZ_Check() will crash if NULL is returned by PyObject_CallMethod().

vinklein commented 7 years ago

I added mpc to gmpy.pxd, along with Cython tests for mpc and tests with 0 precision for mpfr and mpc, and everything is working fine. The tests where mpz raises an exception don't cause a crash.

isuruf commented 7 years ago

@casevh, can we have official manylinux1 and osx wheels? I've tried building manylinux1 wheels and they work fine. https://github.com/isuruf/gmpy2-wheels/releases. The OSX wheels don't work yet.

videlec commented 7 years ago

I should be able to make an alpha release by the end of June.

Do you have a schedule?

casevh commented 7 years ago

I believe all the critical issues for a first alpha have been resolved. I need to update the documentation and then I think I can make a source-only release soon. I will try for this weekend.

I will add a TODO file with the milestones for the following releases. At this point, I assume the next alpha release will follow soon.

casevh commented 7 years ago

can we have official manylinux1 and osx wheels? I've tried building manylinux1 wheels and they work fine. https://github.com/isuruf/gmpy2-wheels/releases. The OSX wheels don't work yet.

I think it would be great to include manylinux1 and osx wheels. May I use the wheels that you build? (Once the version number changes, etc.)

I have a couple of questions.

Are the manylinux1 wheels statically linked with gmp/mpfr/mpc?

Is gmp built with the --enable-fat option?

videlec commented 7 years ago

I need to update the documentation and then I think I can make a source-only release soon.

Do you want me to write some paragraph about Cython usage?

isuruf commented 7 years ago

May I use the wheels that you build? (Once the version number changes, etc.)

Sure. That'll be great

Are the manylinux1 wheels statically linked with gmp/mpfr/mpc?

No, these are linked dynamically since I wanted to keep the wheel LGPL. The dynamic libraries are copied into a .libs_gmpy2 folder with a unique name like libgmp-123456.so. (See https://github.com/pypa/auditwheel)

Is gmp built with the --enable-fat option?

Yes

OSX wheels are not working because delocate (a Python library used for renaming dynamic libs and vendoring for OSX) doesn't work with C extension modules that have no pure Python files. See https://github.com/matthew-brett/delocate/issues/22#issuecomment-313405672. I can make osx wheels statically link to gmp, mpfr, mpc, but we'll have to mention that the wheels are GPL while the source is LGPL.

casevh commented 7 years ago

Do you want me to write some paragraph about Cython usage?

That would be great!

casevh commented 7 years ago

I can make osx wheels statically link to gmp, mpfr, mpc, but we'll have to mention that the wheels are GPL while the source is LGPL.

I always get confused with the licensing. I thought that linking multiple LGPL libraries together would result in another LGPL library (as long as the source code for everything was made available). What triggers the GPL requirement?

isuruf commented 7 years ago

I always get confused with the licensing.

Yes, me too. I guess linking LGPL statically with another LGPL library is not a problem. (I confused it with statically linking LGPL with a BSD-licensed library.) I'll update the code to link statically.

videlec commented 7 years ago

Note that it would be nice to have a link to the documentation https://gmpy2.readthedocs.io/en/latest/ on the PyPI page https://pypi.python.org/pypi/gmpy2

isuruf commented 7 years ago

Here are the new wheels with gmp, mpfr and mpc statically linked. Here's how the wheels are built.

videlec commented 7 years ago

I should be able to make an alpha release by the end of June.

Did you mean June 2017?

casevh commented 7 years ago

My apologies. I did mean June 2017 but I've been dealing with several real life emergencies. I am starting to experiment with twine and testpypi to get used to the new upload process.

@isuruf I made one minor code change to fix a warning with Python 2.7. There should be no practical difference but let me know if you rebuild the wheels.

One last comment on the wheels, and Cython, and inter-operability ... I saw a discussion on SageMath regarding the use of statically linked versus dynamically linked extensions. My only concern in static vs. dynamic is ensuring that Cython code (probably equivalent to saying all-of-Sage) is using the same GMP, MPFR, and MPC libraries as gmpy2. At the moment, the C-API is disabled with a static build of gmpy2. I don't mind enabling the C-API for a static build. Please let me know if you want me to make the change.

(MPFR/MPC could be tricky because the precision is a global variable. The current version of gmpy2 expects the precision to be set to the maximum for the calculation and then rounds the result down to the desired precision. During the rounding process, the existing precision is saved and then restored. If this is different than Sage's usage of the MPFR library, I'll need to change gmpy2 to always set the precision, at least on a dynamically linked build. I probably should do that anyway. Note that these concerns aren't applicable to GMP. I've already disabled changing the memory manager functions in GMP.)
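The save/compute/restore pattern described above has a close analogue in Python's stdlib decimal module, whose contexts also carry a precision setting. The following is only an illustration of the pattern using decimal, not MPFR, and `divide_rounded` is a made-up helper:

```python
from decimal import Decimal, getcontext, localcontext

def divide_rounded(a, b, desired_prec, guard_digits=5):
    # Work at a higher precision first, the way gmpy2 sets the
    # precision to the maximum needed for the calculation...
    with localcontext() as ctx:       # saves the active context
        ctx.prec = desired_prec + guard_digits
        raw = Decimal(a) / Decimal(b)
    # ...then, with the previous context restored, round the
    # intermediate result down to the desired precision.
    with localcontext() as ctx:
        ctx.prec = desired_prec
        return +raw                   # unary plus rounds to ctx.prec

print(divide_rounded(1, 3, 5))   # 0.33333
print(getcontext().prec)         # 28 -- the global setting was restored
```

The key point, mirrored from the discussion above, is that the caller's precision setting survives the calculation because it is saved and restored around the high-precision step.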

I really do hope to get a release out this weekend.

videlec commented 7 years ago

One last comment on the wheels, and Cython, and inter-operability ... I saw a discussion on SageMath regarding the use of statically linked versus dynamically linked extensions. My only concern in static vs. dynamic is ensuring that Cython code (probably equivalent to saying all-of-Sage) is using the same GMP, MPFR, and MPC libraries as gmpy2. At the moment, the C-API is disabled with a static build of gmpy2. I don't mind enabling the C-API for a static build. Please let me know if you want me to make the change.

Makes sense to me.

(MPFR/MPC could be tricky because the precision is a global variable. The current version of gmpy2 expects the precision to be set to the maximum for the calculation and then rounds the result down to the desired precision. During the rounding process, the existing precision is saved and then restored. If this is different than Sage's usage of the MPFR library, I'll need to change gmpy2 to always set the precision, at least on a dynamically linked build. I probably should do that anyway. Note that these concerns aren't applicable to GMP. I've already disabled changing the memory manager functions in GMP.)

In Sage, there is no global variable for precision. Each mpfr carries a given precision and you can only perform operations between mpfr elements with the same precision. If two elements are added and have different precisions, then the one with largest precision is truncated. There is no global variable.

I don't quite understand the usage of the global precision variable in gmpy2.

casevh commented 7 years ago

In Sage, there is no global variable for precision. Each mpfr carries a given precision and you can only perform operations between mpfr elements with the same precision. If two elements are added and have different precisions, then the one with largest precision is truncated. There is no global variable.

I don't quite understand the usage of the global precision variable in gmpy2.

My comment was slightly erroneous. The MPFR library (not gmpy2) has global settings:

The default precision is used by mpfr_init() when creating a new mpfr_t. gmpy2 uses mpfr_init2() which accepts the precision as an argument so the default precision setting is ignored.

On startup, the minimum and maximum exponent values are initialized to default values. Exceeding these limits during a calculation will result in a +INF or -INF result. gmpy2 supports changing the exponent range to emulate other floating point formats, for example float32, float64, float128, etc. In gmpy2 2.0, I only changed the exponent range whenever the active context was changed or updated. Any other calls (say from Cython) into the MPFR library would inherit the gmpy2 exponent range. In gmpy2 2.1, I set the exponent range to the limits (which could be different from the default). I then save, change, and restore the exponent values when checking if the result fits in the desired exponent range (see mpfr_check_range() and mpfr_subnormalize()).

If Cython/Sage changes the exponent range but doesn't restore it, then gmpy2 will use the unexpected values. I should probably update gmpy2 to always reset the exponent range before performing any calculation; at least on dynamically linked builds.

And MPFR 4.0 will change the exponent limits....
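Python's decimal module exposes the same idea as the MPFR exponent limits discussed above: per-context Emin/Emax settings, with overflow producing an infinity once the Overflow trap is disabled. A sketch of the save/change/restore dance, using decimal purely as a stand-in for MPFR:

```python
from decimal import Decimal, getcontext, Overflow

ctx = getcontext()
saved_emax, saved_emin = ctx.Emax, ctx.Emin      # save the current limits

# Shrink the exponent range (as an emulated narrow float format would)
# and make overflow produce an infinity instead of raising an exception.
ctx.Emax, ctx.Emin = 99, -99
ctx.traps[Overflow] = False

huge = Decimal(10) ** 200      # exponent exceeds Emax
print(huge)                    # Infinity

# Restore the saved limits so later callers see the expected range --
# the step gmpy2 performs around mpfr_check_range(), and the step that
# goes wrong if another library changes the range without restoring it.
ctx.Emax, ctx.Emin = saved_emax, saved_emin
ctx.traps[Overflow] = True
```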

videlec commented 7 years ago

Actually, Sage does modify the exponents on startup (lines 341-343 of real_mpfr.pyx):

# On Sage startup, set the exponent range to the maximum allowed
mpfr_set_exp_min(mpfr_get_emin_min())
mpfr_set_exp_max(mpfr_get_emax_max())
casevh commented 7 years ago

I just managed to get successful Appveyor builds created. I should be able to get a release out within a few days.

videlec commented 7 years ago

How is it going? Do you need help with something?

casevh commented 7 years ago

I have finally been able to upload the Windows wheels to testpypi. I had to make several changes to setup.py. The MacOSX and manylinux1 wheels will need to be rebuilt ( @isuruf Can you trigger a rebuild of those wheels? )

I have to resolve a couple of other issues (defining long_description and uploading the source for gmp/mpir/mpfr/mpc). Once those are done, I can make a real release.

I would like to make a followup release fairly soon. There are a couple of incomplete features that I want to remove from the 2.1 series and revisit for the 2.2 series.

@videlec I apologize for the delays. I have very little time to devote to gmpy2 at the moment. To avoid these delays in the future, would you like maintainer access for gmpy2 at pypi? I could also contact Alex Martelli for commit privileges for the main repository. You can contact me directly via the email address in setup.py.

isuruf commented 7 years ago

Here are the manylinux1 and osx wheels, https://github.com/isuruf/gmpy2-wheels/releases/tag/2f883a993e2d4. Btw, we can automate the windows builds as well with appveyor.

casevh commented 7 years ago

@isuruf Thanks for the manylinux1 and osx wheels. I am using Appveyor to make wheels for Windows.

I'll make another release attempt tomorrow.

isuruf commented 7 years ago

It looks like you are using DLLs. They need to be shipped with the wheel. Also, the GMP build is with MSVC and therefore not a fat binary. It's also generic C. What should be a reasonable default for x86 and x86_64?

casevh commented 7 years ago

Thanks for catching this. I think the Windows build should default to static. I can make that change over the weekend.

For the x64 CPU type, I think Core2 is probably the safest.

I've looked at using MSYS2 to compile for Windows. It works but isn't easy to set up. And there are possible issues with distributing the related binaries. It supports a fat build so it performs well. See https://github.com/emphasis87/libmpfr-msys2-mingw64 for a discussion of the licensing issues.

vinklein commented 7 years ago

@casevh Hi! What are the remaining steps to complete the incoming release (missing wheels, issues)? If I can help in any way, please tell me.

casevh commented 7 years ago

I am very sorry but real-life obligations have prevented me from finding time to work on a release. Unfortunately, I don't see that changing in the next few months.

Here is a summary of the current release status:

1) @isuruf has provided manylinux and OSX wheels.

2) I have made progress in building Windows wheels locally. There are many warnings that I need to research and verify are not real bugs.

3) I don't know how to handle the LGPL provision that requires providing the source for any included or statically linked binary files. As I understand the requirements, the easiest (from the compliance perspective) approach is to include the source within the binary wheels. But that would increase the file size dramatically. Another option is to provide two source file downloads - one with just gmpy2 for distributions and for users compiling against local libraries, and a second file with all the various sources. But I don't know how the PyPI / pip ecosystem supports this. (In the old PyPI infrastructure, I could just create another zip file and upload it.)

I'm stuck on the new release process. As a short-term solution, I've thought about just releasing the gmpy2 source and providing links to other sites that provide the wheels.

Here is a summary of the current code status:

1) I need to fix issue 156.

2) I need to research the Windows compiler warnings.

3) There are a couple of experimental features (support for quiet NaN and an alternate implementation of floor division for mpfr) in the 2.1 code base that are incomplete and will probably require a redesign anyway. I would like to create a gmpy2-future branch with this code and then remove it from the main trunk.

4) More testing and documentation is required.

Summary

I need assistance with the entire release process. I just can't devote enough time to develop the process; especially on how to handle providing other source code.

I expect to be able to continue to merge pull requests and continue with testing and bug fixes.

Again, I am very sorry that I have not been able to manage a release. I greatly appreciate your contributions.

Case

casevh commented 7 years ago

Update: I have successfully compiled statically linked binaries for Windows. Next, I will be looking into the compiler warnings generated on Windows.

vinklein commented 7 years ago

That's great. As I said in my last e-mail, I volunteer to look at these warnings if you send me your build configuration.

casevh commented 7 years ago

Just an update. I've spent the past two days trying to isolate what appears to be a race condition in the handling of contexts. The failures are different between pre-3.4 Python and Python 3.4 and later. Practically, it causes test failures with acos() and asin(), but I can't identify the cause.

casevh commented 6 years ago

I have incremented the version number to 2.1.0a1. I have resolved all the Windows compiler warnings and removed two incomplete (and probably broken) features. I am able to create Windows binaries/wheels.

Please test this version. If no issues are reported, I will make a release this coming weekend.

vinklein commented 6 years ago

All tests (test/runtest.py) are OK with this version, both on Ubuntu with gmp and on Windows x64 with mpir. Tests done with Python 2.7 and Python 3.

casevh commented 6 years ago

@isuruf Could I trouble you to rebuild the wheels? I've fixed one reported bug and have bumped the version number. Thanks.

isuruf commented 6 years ago

Done. https://travis-ci.org/isuruf/gmpy2-wheels/builds/301256510

casevh commented 6 years ago

I have been trying to create a release and I have encountered an issue with setuptools and egg files. I encountered it a while ago and had to disable use of setuptools on Linux. I am now trying to use setuptools on Linux and it doesn't provide backwards compatibility.

When just using distutils (i.e. gmpy2 2.0.x), the resulting .so file is placed in <<..>>/site-packages/. A corresponding egg-info file is created. When using setuptools, an .egg directory is created and the .so is placed in that directory. A setuptools-generated install does not delete the distutils .so file, so the old file continues to be used.

I've found an email thread where the new behavior (put the .so in an .egg directory) is the intended behavior. Unfortunately, it also breaks seamless upgrades.

I'd like to revert back to the distutils behavior but I can't find any documentation on how to package C extensions with setuptools. I have found comments stating the lack of documentation is intentional and that Cython/CFFI/SWIG/whatever should be used instead.

Does anyone know how to restore the old distutils behavior?

Note: I haven't tested if installing a manylinux wheel over a distutils gmpy2 2.0.8 actually properly upgrades or not.

casevh commented 6 years ago

@videlec @vinklein @isuruf

I have made a release on PyPI and created a tag for 2.1.0a1. It's not perfect but I had to get something out there. I missed getting a readme uploaded and it is no longer possible to edit the description of a package that has already been uploaded.

vinklein commented 6 years ago

That's good news. @casevh On Linux it seems that the git tag 2.1.0a1 is not consistent with the PyPI version:

Below is the first error I obtain when running the Cython tests (with the latest GitHub version) against the PyPI 2.1.0a1 version.

/gmpy$ pip2 install gmpy2==2.1.0a1 --user
/gmpy$ python2 test_cython/runtests.py
()
Unit tests for gmpy2 2.1.0a1 with Cython 0.27.1
()
Compiling test_cython.pyx because it changed.
[1/1] Cythonizing test_cython.pyx

Error compiling Cython file:

    # Check that the refcount is correct
    assert Py_REFCNT(<PyObject *> x) == 1
    assert Py_REFCNT(<PyObject *> y) == 1

    mpz_set_si(x.z, 3)
               ^
test_cython.pyx:81:16: Cannot convert Python object to '__mpz_struct *'
videlec commented 6 years ago

For me it is much worse

$ pip2 install gmpy2==2.1.0a1 --user --upgrade
...
Successfully installed gmpy2-2.1.0a1
$ python2 test_cython/runtests.py 
()
Unit tests for gmpy2 2.1.0a1 with Cython 0.25.2
()
Compiling test_cython.pyx because it changed.
[1/1] Cythonizing test_cython.pyx
running build_ext
building 'test_cython' extension
creating build
creating build/temp.linux-x86_64-2.7
gcc -pthread -fno-strict-aliasing -march=x86-64 -mtune=generic -O2 -pipe -fstack-protector-strong -fno-plt -DNDEBUG -march=x86-64 -mtune=generic -O2 -pipe -fstack-protector-strong -fno-plt -fPIC -I/tmp/tmpSVlM5M -I/usr/lib/python27.zip -I/usr/lib/python2.7 -I/usr/lib/python2.7/plat-linux2 -I/usr/lib/python2.7/lib-tk -I/usr/lib/python2.7/lib-old -I/usr/lib/python2.7/lib-dynload -I/home/vincent/.local/lib/python2.7/site-packages -I/usr/lib/python2.7/site-packages -I/usr/lib/python2.7/site-packages/gtk-2.0 -I/usr/include/python2.7 -c test_cython.c -o build/temp.linux-x86_64-2.7/test_cython.o
gcc -pthread -shared -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now build/temp.linux-x86_64-2.7/test_cython.o -L/usr/lib -lgmp -lmpfr -lmpc -lpython2.7 -o /tmp/tmpSVlM5M/test_cython.so
Traceback (most recent call last):
  File "<string>", line 1, in <module>
AttributeError: 'module' object has no attribute '_C_API'
cython test failed
videlec commented 6 years ago

And python3

$ pip3 install gmpy2==2.1.0a1 --user --upgrade
...
Successfully installed gmpy2-2.1.0a1
$ python3 test_cython/runtests.py 
Unit tests for gmpy2 2.1.0a1 with Cython 0.27.3
Compiling test_cython.pyx because it changed.
[1/1] Cythonizing test_cython.pyx
running build_ext
building 'test_cython' extension
creating build
creating build/temp.linux-x86_64-3.6
gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -march=x86-64 -mtune=generic -O2 -pipe -fstack-protector-strong -fno-plt -march=x86-64 -mtune=generic -O2 -pipe -fstack-protector-strong -fno-plt -march=x86-64 -mtune=generic -O2 -pipe -fstack-protector-strong -fno-plt -fPIC -I/home/vincent/.local/lib/python3.6/site-packages -I/tmp/tmp3qaes3av -I/usr/lib/python36.zip -I/usr/lib/python3.6 -I/usr/lib/python3.6/lib-dynload -I/usr/lib/python3.6/site-packages -I/usr/include/python3.6m -c test_cython.c -o build/temp.linux-x86_64-3.6/test_cython.o
gcc -pthread -shared -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now build/temp.linux-x86_64-3.6/test_cython.o -L/usr/lib -lgmp -lmpfr -lmpc -lpython3.6m -o /tmp/tmp3qaes3av/test_cython.cpython-36m-x86_64-linux-gnu.so
Traceback (most recent call last):
  File "<string>", line 1, in <module>
SystemError: initialization of test_cython raised unreported exception
cython test failed
videlec commented 6 years ago

Using the source package

$ pip install gmpy2==2.1.0a1 --user --upgrade --no-binary :all:

Cython tests pass with both Python 2 and 3.

casevh commented 6 years ago

At the moment, the statically linked builds don't support the C-API. I'm concerned that Cython code might be using a different library version. If a user is writing Cython code then they should have everything needed to compile from source. But pip always prefers wheels.

I can enable the C-API for static builds but it might cause hard-to-find bugs. I can provide DLLs for Windows so Cython code would need to use those libraries.

I'm open to suggestions.

I can make another alpha release soon.


casevh commented 6 years ago

I'm trying to think of ways to safely interact between a statically linked gmpy2 and Cython. Here are some ideas I've had.

How about adding/recommending code in Cython to verify the GMP/MPFR/MPC versions and raise an error if they are different? But there could be identical versions and the shared library might still be using a different memory allocator....

What about also adding a check of the GMP memory allocation functions? I can add a gmpy2 function that returns the allocation function pointers. If they agree, then I think we should be okay.
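A sketch of what such a version check might look like. `require_matching_versions` is a hypothetical helper; in practice the two dictionaries would be filled from gmpy2's C-API on one side and the Cython module's compile-time version strings on the other:

```python
def require_matching_versions(embedded, linked):
    """Compare the library versions gmpy2 was built against (embedded)
    with the versions a Cython extension is linked against (linked).
    Raise RuntimeError on any mismatch instead of risking silent
    allocator or ABI incompatibilities."""
    mismatches = [
        (name, embedded[name], linked.get(name))
        for name in embedded
        if embedded[name] != linked.get(name)
    ]
    if mismatches:
        details = ", ".join(
            "%s: gmpy2 has %s, extension has %s" % m for m in mismatches
        )
        raise RuntimeError("library version mismatch -- " + details)

# Identical versions pass silently; a mismatch raises.
require_matching_versions(
    {"gmp": "6.1.2", "mpfr": "3.1.6", "mpc": "1.0.3"},
    {"gmp": "6.1.2", "mpfr": "3.1.6", "mpc": "1.0.3"},
)
```

As the comment above notes, matching version strings are necessary but not sufficient; the allocator-pointer comparison would close the remaining gap.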

videlec commented 6 years ago

After discussion with Vincent Klein we propose to not distribute (and install) the Cython files with the wheels. This concerns the two files src/gmpy2.pxd and src/gmpy2.h.

jdemeyer commented 6 years ago

I have been trying to create a release and I have encountered an issue with setuptools and egg files.

You should avoid eggs for other reasons too: it will make the files gmpy2.pxd and gmpy2.h inaccessible.

The solution is to use pip to install your package instead of setup.py install.

jdemeyer commented 6 years ago

After discussion with Vincent Klein we propose to not distribute (and install) the Cython files with the wheels.

Why?

videlec commented 6 years ago

After discussion with Vincent Klein we propose to not distribute (and install) the Cython files with the wheels.

Why?

Because with wheels:

Any constructive solution would of course be better but that could be postponed for the next release.

jdemeyer commented 6 years ago

I'm trying to think of ways to safely interact between a statically linked gmpy2 and Cython.

I don't think that this is possible. If the same library (GMP) is used in different places (gmpy2 internally and some other Cython package), the only option is dynamic linking. But why would you use static linking anyway? That is a bad idea for various reasons.