ContinuumIO / anaconda-issues

Anaconda issue tracking

compiler_compat/ld does not work with non-ancient gcc #11152

Closed rgommers closed 4 years ago

rgommers commented 5 years ago

Actual Behavior

On Linux, installing the Anaconda distribution for Python 3.7 via the regular installer installs compiler_compat/ld. That breaks building Python extensions with recent/normal versions of gcc, and has done so for a long time. To reproduce:

$ conda --version
conda 4.7.10
$ gcc --version
gcc (GCC) 9.1.0
$ conda create -n dummy
$ conda activate dummy
$ conda install cython
$ git clone https://github.com/numpy/numpy.git
$ cd numpy
$ python setup.py build_ext -i
...
RuntimeError: Broken toolchain: cannot link a simple C program
$ ls ~/anaconda3/envs/dummy/compiler_compat/
ld  README
$ rm ~/anaconda3/envs/dummy/compiler_compat/ld
$ python setup.py build_ext -i   # works now

Full build failure:

``` $ python setup.py build_ext -i Running from numpy source directory. Cythonizing sources numpy/random/bounded_integers.pxd.in has not changed numpy/random/generator.pyx has not changed numpy/random/sfc64.pyx has not changed numpy/random/mtrand.pyx has not changed numpy/random/common.pyx has not changed numpy/random/pcg64.pyx has not changed numpy/random/bit_generator.pyx has not changed numpy/random/entropy.pyx has not changed numpy/random/philox.pyx has not changed numpy/random/mt19937.pyx has not changed numpy/random/bounded_integers.pyx.in has not changed numpy/random/bounded_integers.pyx has not changed blas_opt_info: blas_mkl_info: customize UnixCCompiler libraries mkl_rt not found in ['/home/rgommers/anaconda3/envs/dummy/lib', '/usr/local/lib', '/usr/lib64', '/usr/lib', '/usr/lib/'] NOT AVAILABLE blis_info: customize UnixCCompiler libraries blis not found in ['/home/rgommers/anaconda3/envs/dummy/lib', '/usr/local/lib', '/usr/lib64', '/usr/lib', '/usr/lib/'] NOT AVAILABLE openblas_info: customize UnixCCompiler customize UnixCCompiler customize UnixCCompiler FOUND: libraries = ['openblas', 'openblas'] library_dirs = ['/usr/lib64'] language = c define_macros = [('HAVE_CBLAS', None)] FOUND: libraries = ['openblas', 'openblas'] library_dirs = ['/usr/lib64'] language = c define_macros = [('HAVE_CBLAS', None)] non-existing path in 'numpy/distutils': 'site.cfg' lapack_opt_info: lapack_mkl_info: customize UnixCCompiler libraries mkl_rt not found in ['/home/rgommers/anaconda3/envs/dummy/lib', '/usr/local/lib', '/usr/lib64', '/usr/lib', '/usr/lib/'] NOT AVAILABLE openblas_lapack_info: customize UnixCCompiler customize UnixCCompiler customize UnixCCompiler C compiler: gcc -pthread -B /home/rgommers/anaconda3/envs/dummy/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC creating /tmp/tmpvtn79wil/tmp creating /tmp/tmpvtn79wil/tmp/tmpvtn79wil compile options: '-c' gcc: /tmp/tmpvtn79wil/source.c /tmp/tmpvtn79wil/source.c:1:1: warning: function declaration isn’t a prototype [-Wstrict-prototypes] 1 | void zungqr_(); | ^~~~ gcc -pthread -B /home/rgommers/anaconda3/envs/dummy/compiler_compat -Wl,--sysroot=/ /tmp/tmpvtn79wil/tmp/tmpvtn79wil/source.o -L/usr/lib64 -lopenblas -o /tmp/tmpvtn79wil/a.out /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: /tmp/tmpvtn79wil/tmp/tmpvtn79wil/source.o: unable to initialize decompress status for section .debug_info /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: /tmp/tmpvtn79wil/tmp/tmpvtn79wil/source.o: unable to initialize decompress status for section .debug_info /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: /tmp/tmpvtn79wil/tmp/tmpvtn79wil/source.o: unable to initialize decompress status for section .debug_info /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: /tmp/tmpvtn79wil/tmp/tmpvtn79wil/source.o: unable to initialize decompress status for section .debug_info /tmp/tmpvtn79wil/tmp/tmpvtn79wil/source.o: file not recognized: file format not recognized collect2: error: ld returned 1 exit status NOT AVAILABLE openblas_clapack_info: customize UnixCCompiler customize UnixCCompiler customize UnixCCompiler C compiler: gcc -pthread -B /home/rgommers/anaconda3/envs/dummy/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC creating /tmp/tmpevi68f5r/tmp creating /tmp/tmpevi68f5r/tmp/tmpevi68f5r compile options: '-c' gcc: /tmp/tmpevi68f5r/source.c /tmp/tmpevi68f5r/source.c:1:1: warning: function declaration isn’t a prototype 
[-Wstrict-prototypes] 1 | void zungqr_(); | ^~~~ gcc -pthread -B /home/rgommers/anaconda3/envs/dummy/compiler_compat -Wl,--sysroot=/ /tmp/tmpevi68f5r/tmp/tmpevi68f5r/source.o -L/usr/lib64 -lopenblas -llapack -o /tmp/tmpevi68f5r/a.out /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: /tmp/tmpevi68f5r/tmp/tmpevi68f5r/source.o: unable to initialize decompress status for section .debug_info /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: /tmp/tmpevi68f5r/tmp/tmpevi68f5r/source.o: unable to initialize decompress status for section .debug_info /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: /tmp/tmpevi68f5r/tmp/tmpevi68f5r/source.o: unable to initialize decompress status for section .debug_info /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: /tmp/tmpevi68f5r/tmp/tmpevi68f5r/source.o: unable to initialize decompress status for section .debug_info /tmp/tmpevi68f5r/tmp/tmpevi68f5r/source.o: file not recognized: file format not recognized collect2: error: ld returned 1 exit status NOT AVAILABLE flame_info: customize UnixCCompiler libraries flame not found in ['/home/rgommers/anaconda3/envs/dummy/lib', '/usr/local/lib', '/usr/lib64', '/usr/lib', '/usr/lib/'] NOT AVAILABLE atlas_3_10_threads_info: Setting PTATLAS=ATLAS customize UnixCCompiler libraries lapack_atlas not found in /home/rgommers/anaconda3/envs/dummy/lib customize UnixCCompiler libraries tatlas,tatlas not found in /home/rgommers/anaconda3/envs/dummy/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/local/lib customize UnixCCompiler libraries tatlas,tatlas not found in /usr/local/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/lib64 customize UnixCCompiler libraries tatlas,tatlas not found in /usr/lib64 customize UnixCCompiler libraries lapack_atlas not found in /usr/lib customize UnixCCompiler libraries tatlas,tatlas not found in /usr/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/lib/ customize UnixCCompiler libraries tatlas,tatlas not found in /usr/lib/ NOT AVAILABLE atlas_3_10_info: customize UnixCCompiler libraries lapack_atlas not found in /home/rgommers/anaconda3/envs/dummy/lib customize UnixCCompiler libraries satlas,satlas not found in /home/rgommers/anaconda3/envs/dummy/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/local/lib customize UnixCCompiler libraries satlas,satlas not found in /usr/local/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/lib64 customize UnixCCompiler libraries satlas,satlas not found in /usr/lib64 customize UnixCCompiler libraries lapack_atlas not found in /usr/lib customize UnixCCompiler libraries satlas,satlas not found in /usr/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/lib/ customize UnixCCompiler libraries satlas,satlas not found in /usr/lib/ NOT AVAILABLE atlas_threads_info: Setting PTATLAS=ATLAS customize UnixCCompiler libraries lapack_atlas not found in /home/rgommers/anaconda3/envs/dummy/lib customize UnixCCompiler libraries ptf77blas,ptcblas,atlas not found in /home/rgommers/anaconda3/envs/dummy/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/local/lib customize UnixCCompiler libraries ptf77blas,ptcblas,atlas not found in /usr/local/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/lib64 customize UnixCCompiler libraries ptf77blas,ptcblas,atlas not found in /usr/lib64 customize UnixCCompiler libraries lapack_atlas not found in /usr/lib customize UnixCCompiler libraries ptf77blas,ptcblas,atlas not 
found in /usr/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/lib/ customize UnixCCompiler libraries ptf77blas,ptcblas,atlas not found in /usr/lib/ NOT AVAILABLE atlas_info: customize UnixCCompiler libraries lapack_atlas not found in /home/rgommers/anaconda3/envs/dummy/lib customize UnixCCompiler libraries f77blas,cblas,atlas not found in /home/rgommers/anaconda3/envs/dummy/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/local/lib customize UnixCCompiler libraries f77blas,cblas,atlas not found in /usr/local/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/lib64 customize UnixCCompiler libraries f77blas,cblas,atlas not found in /usr/lib64 customize UnixCCompiler libraries lapack_atlas not found in /usr/lib customize UnixCCompiler libraries f77blas,cblas,atlas not found in /usr/lib customize UnixCCompiler libraries lapack_atlas not found in /usr/lib/ customize UnixCCompiler libraries f77blas,cblas,atlas not found in /usr/lib/ NOT AVAILABLE accelerate_info: NOT AVAILABLE lapack_info: customize UnixCCompiler customize UnixCCompiler FOUND: libraries = ['lapack', 'lapack'] library_dirs = ['/usr/lib64'] language = f77 FOUND: libraries = ['lapack', 'lapack', 'openblas', 'openblas'] library_dirs = ['/usr/lib64'] language = c define_macros = [('HAVE_CBLAS', None), ('NO_ATLAS_INFO', 1)] non-existing path in 'numpy/random': 'src/splitmix64/splitmix.h' /home/rgommers/anaconda3/envs/dummy/lib/python3.7/distutils/dist.py:274: UserWarning: Unknown distribution option: 'define_macros' warnings.warn(msg) running build_ext running build_src build_src building py_modules sources building library "npymath" sources get_default_fcompiler: matching types: '['gnu95', 'intel', 'lahey', 'pg', 'absoft', 'nag', 'vast', 'compaq', 'intele', 'intelem', 'gnu', 'g95', 'pathf95', 'nagfor']' customize Gnu95FCompiler Found executable /usr/bin/gfortran customize Gnu95FCompiler customize Gnu95FCompiler using config C compiler: gcc -pthread -B /home/rgommers/anaconda3/envs/dummy/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC compile options: '-Inumpy/core/src/common -Inumpy/core/src -Inumpy/core -Inumpy/core/src/npymath -Inumpy/core/src/multiarray -Inumpy/core/src/umath -Inumpy/core/src/npysort -I/home/rgommers/anaconda3/envs/dummy/include/python3.7m -c' gcc: _configtest.c gcc -pthread -B /home/rgommers/anaconda3/envs/dummy/compiler_compat -Wl,--sysroot=/ _configtest.o -o _configtest /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: _configtest.o: unable to initialize decompress status for section .debug_info /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: _configtest.o: unable to initialize decompress status for section .debug_info /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: _configtest.o: unable to initialize decompress status for section .debug_info /home/rgommers/anaconda3/envs/dummy/compiler_compat/ld: _configtest.o: unable to initialize decompress status for section .debug_info _configtest.o: file not recognized: file format not recognized collect2: error: ld returned 1 exit status failure. 
removing: _configtest.c _configtest.o _configtest.o.d Traceback (most recent call last): File "setup.py", line 443, in setup_package() File "setup.py", line 435, in setup_package setup(**metadata) File "/home/rgommers/code/tmp/realtmp/numpy/numpy/distutils/core.py", line 171, in setup return old_setup(**new_attr) File "/home/rgommers/anaconda3/envs/dummy/lib/python3.7/site-packages/setuptools/__init__.py", line 145, in setup return distutils.core.setup(**attrs) File "/home/rgommers/anaconda3/envs/dummy/lib/python3.7/distutils/core.py", line 148, in setup dist.run_commands() File "/home/rgommers/anaconda3/envs/dummy/lib/python3.7/distutils/dist.py", line 966, in run_commands self.run_command(cmd) File "/home/rgommers/anaconda3/envs/dummy/lib/python3.7/distutils/dist.py", line 985, in run_command cmd_obj.run() File "/home/rgommers/code/tmp/realtmp/numpy/numpy/distutils/command/build_ext.py", line 79, in run self.run_command('build_src') File "/home/rgommers/anaconda3/envs/dummy/lib/python3.7/distutils/cmd.py", line 313, in run_command self.distribution.run_command(command) File "/home/rgommers/anaconda3/envs/dummy/lib/python3.7/distutils/dist.py", line 985, in run_command cmd_obj.run() File "/home/rgommers/code/tmp/realtmp/numpy/numpy/distutils/command/build_src.py", line 142, in run self.build_sources() File "/home/rgommers/code/tmp/realtmp/numpy/numpy/distutils/command/build_src.py", line 153, in build_sources self.build_library_sources(*libname_info) File "/home/rgommers/code/tmp/realtmp/numpy/numpy/distutils/command/build_src.py", line 286, in build_library_sources sources = self.generate_sources(sources, (lib_name, build_info)) File "/home/rgommers/code/tmp/realtmp/numpy/numpy/distutils/command/build_src.py", line 369, in generate_sources source = func(extension, build_dir) File "numpy/core/setup.py", line 669, in get_mathlib_info raise RuntimeError("Broken toolchain: cannot link a simple C program") RuntimeError: Broken toolchain: cannot link a simple C program ```

Expected Behavior

Compiling Python extensions with the system gcc should work.

Steps to Reproduce

See above. I have seen this on a number of Linux versions (Arch, Antergos, Manjaro at least) and GCC versions.

Anaconda or Miniconda version:

Anaconda3-2019.03-Linux-x86_64.sh

Operating System:

64-bit Linux (Arch, but happens on other distros as well)

conda info
```
$ conda info
     active environment : dummy
    active env location : /home/rgommers/anaconda3/envs/dummy
            shell level : 2
       user config file : /home/rgommers/.condarc
 populated config files :
          conda version : 4.7.10
    conda-build version : 3.18.8
         python version : 3.7.3.final.0
       virtual packages : __cuda=10.1
       base environment : /home/rgommers/anaconda3 (writable)
           channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/r/linux-64
                          https://repo.anaconda.com/pkgs/r/noarch
          package cache : /home/rgommers/anaconda3/pkgs
                          /home/rgommers/.conda/pkgs
       envs directories : /home/rgommers/anaconda3/envs
                          /home/rgommers/.conda/envs
               platform : linux-64
             user-agent : conda/4.7.10 requests/2.22.0 CPython/3.7.3 Linux/5.2.1-arch1-1-ARCH antergos/ glibc/2.29
                UID:GID : 1000:985
             netrc file : None
           offline mode : False
```
conda list --show-channel-urls
```
$ conda list --show-channel-urls
# packages in environment at /home/rgommers/anaconda3/envs/dummy:
#
# Name                    Version                   Build  Channel
_libgcc_mutex             0.1                        main  defaults
ca-certificates           2019.5.15                     0  defaults
certifi                   2019.6.16                py37_0  defaults
cython                    0.29.12          py37he6710b0_0  defaults
libedit                   3.1.20181209         hc058e9b_0  defaults
libffi                    3.2.1                hd88cf55_4  defaults
libgcc-ng                 9.1.0                hdf63c60_0  defaults
libstdcxx-ng              9.1.0                hdf63c60_0  defaults
ncurses                   6.1                  he6710b0_1  defaults
openssl                   1.1.1c               h7b6447c_1  defaults
pip                       19.1.1                   py37_0  defaults
python                    3.7.3                h0371630_0  defaults
readline                  7.0                  h7b6447c_5  defaults
setuptools                41.0.1                   py37_0  defaults
sqlite                    3.29.0               h7b6447c_0  defaults
tk                        8.6.8                hbc83047_0  defaults
wheel                     0.33.4                   py37_0  defaults
xz                        5.2.4                h14c3975_4  defaults
zlib                      1.2.11               h7b6447c_3  defaults
```
rgommers commented 5 years ago

Here's another report from a week ago that also says "removing ld solves the issue": https://github.com/conda/conda/issues/6030

jjhelmus commented 5 years ago

This is a tricky issue.

We include compiler_compat/ld so that users can use their system compiler to build a Python extension and link against libraries provided by conda packages. The system linker may not be new enough to understand the relocations present in these libraries; see, for example, @msarahan's comment in conda/conda#6030.

On the other hand, newer system compilers can produce objects that cannot be understood by compiler_compat/ld.

@msarahan, @mingwandroid and I have been talking about the best approach to take here. One option we have been considering is to examine the system linker and use it if it is newer than the compiler_compat/ld version.
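
For illustration, a rough sketch of what such a check might look like inside an ld wrapper; the env path and the naive version parsing are assumptions, not the actual implementation being considered:

```bash
# Compare the system ld with the bundled compiler_compat/ld and prefer
# whichever is newer. Version extraction here is deliberately simplistic.
bundled_ld=~/anaconda3/envs/dummy/compiler_compat/ld
sys_ver=$(ld --version | head -n1 | grep -oE '[0-9]+\.[0-9]+(\.[0-9]+)?' | head -n1)
bundled_ver=$("${bundled_ld}" --version | head -n1 | grep -oE '[0-9]+\.[0-9]+(\.[0-9]+)?' | head -n1)
newest=$(printf '%s\n%s\n' "${sys_ver}" "${bundled_ver}" | sort -V | tail -n1)
if [[ "${newest}" == "${sys_ver}" ]]; then
  exec "$(command -v ld)" "$@"   # system ld is at least as new: use it
else
  exec "${bundled_ld}" "$@"      # bundled ld is newer: keep the compatibility linker
fi
```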

mingwandroid commented 5 years ago

I'm looking at this now. Another option is to put this linker binary into a separate package that gets recreated each time we update our compilers, but then we get on the compiler-rebuild treadmill.

That treadmill isn't too bad, and in fact we've already built GCC 8 and GCC 9 packages at Anaconda, but due to gfortran ABI incompatibilities we've been unable to use them.

mingwandroid commented 5 years ago

@rgommers, this is the sharp end of the interface between conda software and the rest of the software world, for sure. Can I ask you though, how common is this, and why is it common? Do you feel it should be? I certainly do not!

Note, I'm not suggesting we shouldn't do our best here, but there will be costs.

mingwandroid commented 5 years ago

To be clear, if people are linking C/C++ Python extension modules using the Anaconda Distribution (or conda-forge) Pythons, then they should be also using our C/C++ compilers.

I believe that is a reasonable statement.

mingwandroid commented 5 years ago

Hi @rgommers and @conda-forge/core (why does this not work?!), @jjhelmus, what do you think of this (it is very much WIP)?

#!/usr/bin/env bash

function ld_fn()
{
  if [[ ${CC} =~ .*conda.* ]]; then
    # Using conda compilers. Good.
    $(dirname "${BASH_SOURCE[0]}")/ld.bin "$@"
  else
    local _GCC=gcc
    if [[ -n ${CC} ]]; then
      _GCC="${CC}"
    fi
    # Is it just GCC being too new? Not really. It is that GCC is
    # compressing its debugging sections and our linker knows nothing
    # about that. Can we detect whether GCC does that or not instead?
    local GCC_SYSV=$("${_GCC}" --version)
    GCC_SYSV=${GCC_SYSV##* }
    if [[ ${GCC_SYSV} =~ 1[.].* ]] ||
       [[ ${GCC_SYSV} =~ 2[.].* ]] ||
       [[ ${GCC_SYSV} =~ 3[.].* ]] ||
       [[ ${GCC_SYSV} =~ 4[.].* ]] ||
       [[ ${GCC_SYSV} =~ 5[.].* ]] ||
       [[ ${GCC_SYSV} =~ 6[.].* ]] ||
       [[ ${GCC_SYSV} =~ 7[.].* ]]; then
      $(dirname "${BASH_SOURCE[0]}")/ld.bin "$@"
    else
      "${_GCC}" "$@"
    fi
  fi
}

_COMPILER_COMPAT_LD_ARGS=( "$@" )
ld_fn "${_COMPILER_COMPAT_LD_ARGS[@]}"

My main concern with this approach is that if anyone attempts to load ld as a binary file or to execute it in some weird way then things could fail. Clearly it needs testing! The other alternative is for us to churn out compilers at a constant rate and split compiler_compat/ld into a separate package created at each compiler rebuild; still, our old Pythons would remain incompatible with newer systems (and might become incompatible at update time) for people pushing at this compatibility boundary instead of using our compilers.

mingwandroid commented 5 years ago

To test, rename the existing binary to ld.bin, save this script as a new ld, and set it to be executable.
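
A minimal sketch of that test setup, assuming a default Anaconda install and an env named dummy (adjust the paths to your own install):

```bash
cd ~/anaconda3/envs/dummy/compiler_compat
mv ld ld.bin                       # keep the original linker as ld.bin
cp /path/to/the/wrapper/script ld  # the shell script from the previous comment
chmod +x ld
```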

mingwandroid commented 5 years ago

If anyone feels this dynamic behaviour switch could/should be moved to distutils then please shout!

mingwandroid commented 5 years ago

Oh @rgommers, since conda-forge is considered our upstream and filing the issue here seems to prevent me from pinging some people, would it be OK for me to move this issue to conda-forge/python-feedstock where it will get more community visibility?

jjhelmus commented 5 years ago

@mingwandroid I like the bash script to dynamically detect the version of gcc and delegate to the appropriate linker.

What do you think about implementing this check in the _sysconfigdata*.py file? Some Python logic could examine the CC variable and the gcc version and either load or modify the build_time_vars dictionary.

I think the shell script option is more straightforward but wanted to offer an alternative.

Either method should include an override to select either the system or the compiler_compat linker for debugging purposes. Adding a check for a _CONDA_PYTHON_USE_VENDORED_LD environment variable or something similar seems like an option.

jjhelmus commented 5 years ago

@conda-forge/python for visibility

mingwandroid commented 5 years ago

These are all good ideas @jjhelmus, thank you. I will go ahead and implement it in sysconfigdata.

If you want to check out what the shell script version looks like it will be available as a package fairly soon via:

conda install -c rdonnelly python=3.7.4

.. the recipe is here:

[https://github.com/AnacondaRecipes/python-feedstock/tree/master-3.7.14/recipe](https://github.com/AnacondaRecipes/python-feedstock/tree/master-3.7.14/recipe)

mingwandroid commented 5 years ago

My version checking in this shell script is not good, btw.

I also wonder whether we should be more feature-oriented in our decision-making here? It's some compression format on the .debug_info section that our linker doesn't understand. Should that be the discriminator instead? And in future, when we step out of cutting-edge modernity due to some other feature(s), should we add detection for those too, or just go with the "use the system ld when the system GCC is newer (and LD is not set, I suppose!)" approach?
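
To make the feature-oriented idea concrete, one possible probe, purely as a sketch: compile a trivial object with the candidate gcc and see whether the bundled linker can actually consume it via a relocatable link. The ld.bin name follows the wrapper script above, and whether ld -r is a sufficient test is itself an assumption.

```bash
# Feature probe: can the bundled linker read what this gcc emits?
probe_dir=$(mktemp -d)
echo 'int main(void) { return 0; }' > "${probe_dir}/probe.c"
"${_GCC:-gcc}" -g -c "${probe_dir}/probe.c" -o "${probe_dir}/probe.o"
if "$(dirname "${BASH_SOURCE[0]}")/ld.bin" -r "${probe_dir}/probe.o" \
     -o "${probe_dir}/probe.r.o" >/dev/null 2>&1; then
  use_bundled_ld=yes   # bundled ld copes with this object, .debug_info included
else
  use_bundled_ld=no    # e.g. compressed debug sections it cannot read: fall back to the system toolchain
fi
rm -rf "${probe_dir}"
```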

mingwandroid commented 5 years ago

I do not mind implementing this feature in both places too, so we can provide more workarounds, switchable via env. vars.

isuruf commented 5 years ago

The other alternative is for us to churn out compilers at a constant rate and split compiler_compat/ld into a separate package created at each compiler rebuild; still, our old Pythons would remain incompatible with newer systems (and might become incompatible at update time) for people pushing at this compatibility boundary instead of using our compilers.

I like this, as I don't like the fact that the python package is vendoring ld. This essentially makes the conda package for python GPL.

mingwandroid commented 5 years ago

It seems my test python 3.7.4 package with the 'ld as shell script' approach works OK in @rgommers's test case.

mingwandroid commented 5 years ago

A test package which implements these changes in sysconfigdata is now also available on the rdonnelly channel (so build #0 is the shell-script approach, #1 the sysconfigdata one).

This approach works OK in @rgommers's test case too.

rgommers commented 5 years ago

Sorry for the slow reply, I was away for a few days. And thanks for the detailed answers!

would it be OK for me to move this issue to conda-forge/python-feedstock where it will get more community visibility?

of course, fine with me

To be clear, if people are linking C/C++ Python extension modules using the Anaconda Distribution (or conda-forge) Pythons, then they should be also using our C/C++ compilers.

I believe that is a reasonable statement.

I don't think it is. I personally am using the conda compilers by default on one of my machines, because I want to make sure things work as expected for numpy.distutils. But the conda compilers definitely still are a niche thing with some rough edges (e.g. numpy 1.16.1-1.16.2 don't work at all, pytorch build warns due to the odd custom compiler name, etc.). Having a standard Linux box with gcc 7/8/9, installing Anaconda (which doesn't contain the compilers by default), and then having a standard python setup.py install for any package with a C extension fail unless you do conda install gcc_linux-64 first doesn't seem that reasonable.

It seems my test python 3.7.4 package with the 'ld as shell script' approach works OK in @rgommers test-case.

thanks, I'll try this!

jjhelmus commented 5 years ago

Having a standard Linux box with gcc 7/8/9, installing Anaconda (which doesn't contain the compilers by default), and then having a standard python setup.py install for any package with a C extension fail unless you do conda install gcc_linux-64 first doesn't seem that reasonable.

We want this to work as well, but we also need to support the conda-provided compilers and systems with older compilers. Having python setup.py install work in all three of these cases is challenging given how distutils loads configuration information, the ever-changing nature of compatibility in gcc and binutils, and the variety of Linux distributions where conda is installed. Our recommendation is always going to be to use the conda-provided compilers because these let us remove a large portion of this variability.

@mingwandroid's modifications in the test packages have moved this compatibility forward to a point where I think compiling C extensions should "just work" in the majority of cases, but we are always looking for contributions which will further improve this compatibility.

mingwandroid commented 5 years ago

Hi @rgommers,

No problem at all on the reply. There's always plenty to keep us occupied.

I don't think it is. I personally am using the conda compilers by default on one of my machines, because I want to make sure things work as expected for numpy.distutils

But the conda compilers definitely still are a niche thing with some rough edges (e.g. numpy 1.16.1-1.16.2 don't work at all, pytorch build warns due to the odd custom compiler name, etc.).

Bug reports and/or links to them would be appreciated, sorry if these exist and I've missed them.

They are, in nearly all regards, common-or-garden cross-compilers. In fact, the decision to do this was, I believe, a good thing for software projects that care about having well-crafted, portable and capable build system metadata (and/or scripts, depending on the exact system). That other tools or libs need some adjustment is often synonymous with them needing to be fixed for cross-compilation in general, when that isn't the case (we have one point of difference on Linux from common-or-garden, but it is minor [1]), we will try to make the build system accommodate us. I think the CMake team may be considering adding support for our compilers, for example. I also believe our compilers (now largely being progressed by @isuruf, thankfully) to be fundamental to a 'better' way of computing than traditional Linux, flatpak or snap (not to mention the other OSes), though I'm not going to try to sell you those arguments here!

Having a standard Linux box with gcc 7/8/9, installing Anaconda (which doesn't contain the compilers by default), and then having a standard python setup.py install for any package with a C extension fail unless you do conda install gcc_linux-64 first doesn't seem that reasonable.

Do you agree with my claim that, since we use shared libraries (and cannot and should not budge on that! Security matters; static linking is insecure and inefficient) and out of necessity need newer versions of Fortran, C++ and GCC than our minimum supported OSes provide by default (and they will dynamically link to/autoload their transitive system dependencies too, where they too will get loaded preferentially, except when the SONAME has been mangled by some tool to prevent that particular strcmp() - or whatever - from returning 0), we are forced to build new compilers and new language runtime libraries to go along with those?

If you agree with that I'd love to hear any ideas for solutions I can implement that do not involve adding compilers as dependencies of our conda Python packages (which is an option still open to us, IMHO, maybe via metapackages such as python-cxx-devel). In fact, I should note that our r-base package does depend upon our compilers on all 3 OSes (MSYS2/mingw-w64 ones on Windows), and I prefer that approach, at least for R. Some people disagree, and may even forcibly uninstall them, but IMHO they are ignoring how operating system loaders work and our recommendations as encoded in our dependencies, and are doing so at their own risk (another caveat: we have work to do to improve conda skeleton cran around system libs, and I may investigate hooking the install.packages() command some day, as I believe RStudio do, but that's off-tangent here, sorry).

Things need to work, and they need to work well. Edge cases are not really something we can devote a lot of time to, but we do intend our Pythons to be usable with system compilers if they are newer, ABI-compatible, support whatever special ELF/dylib sections we need in our binaries, and using them does not pull in incompatible libraries; hence our willingness to explore some pretty egregious hacks like the two I have implemented here.

We cannot be considered completely responsible, though, when our software links to "other people's software (and maybe even pulls in a load of other system libs that are incompatible with conda libs)" and things go wrong; therefore I cannot recommend it (but will try to fix it). Still, the message should be loud and clear: this is risky stuff.

BTW, did you try the sysconfig approach as suggested by @jjhelmus in build #1?

[1] This is to do with whether directories passed as -I/-L should have the sysroot prefix added to them or not; we say not, because the stuff we point to with -I and -L lives outside the sysroot. (Also note, we could explore ways to fudge that with bind mounts, for example, if that made things more common-or-garden, but really, that detail has always been ambiguous anyway, and I don't think I added any patches to GCC to change it; we may have had to make a change to ld at some point though.) I would also like to ping @nehaljwani on this subject as he looked at compilers more recently than I did (in fact, @msarahan did builds of GCC 8 and @nehaljwani did builds of GCC 9 already, and we use those runtime compiler libs from GCC 9 quite happily with software compiled by GCC 7, but we're confident to do that because we are in full control of the stack).

rgommers commented 5 years ago

Bug reports and/or links to them would be appreciated, sorry if these exist and I've missed them.

No worries - I have fixed them already. I've kept a log of common issues building NumPy and SciPy with Conda compilers at https://github.com/numpy/numpy/issues/13280. I think this ld issue is the only one that's clearly a Conda issue rather than an issue in numpy.distutils or another package.

rgommers commented 5 years ago

That other tools or libs need some adjustment is often synonymous with them needing to be fixed for cross-compilation in general, when that isn't the case

Fully agree. This is part of why there are rough edges though. Cross-compiling is not commonly done, so it needs quite a bit of dogfooding before it's ready for general use. That's what I alluded to with "rough edges". All of distutils isn't really designed properly for cross-compiling.

If you agree with that I'd love to hear any ideas for solutions I can implement

Yes, I agree (and I'm sure you've thought about this much harder than I have).

Not sure it needs extremely complicated solutions - clearer error messages would help a lot. And if you have to choose between supporting older or newer gcc's, it would make more sense (imho) to require newer compilers by default. The compiler_compat/ld is now set up for old ones; instead, just erroring out on old ones and letting users either upgrade or explicitly install the current ld would be an improvement.

we do intend our Pythons to be usable with system compilers if they are newer, ABI-compatible, support whatever special ELF/dylib sections we need in our binaries, and using them does not pull in incompatible libraries; hence our willingness to explore some pretty egregious hacks like the two I have implemented here.

This sounds very reasonable.

BTW, did you try the sysconfig approach as suggested by @jjhelmus in build #1?

not yet, that will be a weekend project

mingwandroid commented 5 years ago

All of distutils isn't really designed properly for cross-compiling

Well, our changes to sysconfigdata are specifically meant to address these matters (so long as you use our compilers, of course .. but we could think to generalize it some and see if upstream cares for this or has better ideas (beyond a new platform tag, which is something we cannot contemplate)).

mingwandroid commented 5 years ago

Great, thanks for the useful discussion @rgommers. I may catch up with you on the weekend.

rgommers commented 5 years ago

Installed python via conda install -c rdonnelly python in a new env.

In the root of the numpy repo, python setup.py build_ext -i finishes, but running the tests or importing numpy fails. The (or a) reason is that the ABI flag looks wrong; example:

numpy/random/philox.cpython-@PYVERNODOTS@m-x86_64-linux-gnu.so
NightMachinery commented 5 years ago

I just had this issue on a new install of miniconda with gcc 9 on ubuntu.

...
  /home/eva/miniconda3/compiler_compat/ld: build/temp.linux-x86_64-3.7/regex_3/_regex.o: unable to initialize decompress status for section .debug_info
  build/temp.linux-x86_64-3.7/regex_3/_regex.o: file not recognized: file format not recognized
  collect2: error: ld returned 1 exit status
  error: command 'gcc' failed with exit status 1
  ----------------------------------------
  ERROR: Failed building wheel for regex
  Running setup.py clean for regex
Failed to build regex
...
chux0519 commented 5 years ago

I just had this issue on a new install of miniconda with gcc 9 on ubuntu.

...
  /home/eva/miniconda3/compiler_compat/ld: build/temp.linux-x86_64-3.7/regex_3/_regex.o: unable to initialize decompress status for section .debug_info
  build/temp.linux-x86_64-3.7/regex_3/_regex.o: file not recognized: file format not recognized
  collect2: error: ld returned 1 exit status
  error: command 'gcc' failed with exit status 1
  ----------------------------------------
  ERROR: Failed building wheel for regex
  Running setup.py clean for regex
Failed to build regex
...

You may try to rename /home/eva/miniconda3/compiler_compat/ld to ld_old, or just delete it. I'm using Arch, and this fixed my problem.
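
Spelled out, using the install prefix from the quoted log (substitute your own):

```bash
mv /home/eva/miniconda3/compiler_compat/ld /home/eva/miniconda3/compiler_compat/ld_old
# or, if you prefer to remove it entirely:
# rm /home/eva/miniconda3/compiler_compat/ld
```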

NightMachinery commented 5 years ago

Yeah, I did exactly that, too. I just wanted to chime in so that this issue gets more attention. Truth be told, I have found conda to be a powerful but buggy tool and no longer recommend it to beginners.

mingwandroid commented 5 years ago

This has been fixed in python 3.7.4 already (for a long time). conda update python=3.7.4 should see you right. Let me know if that's not the case.

@NightMachinary good luck with whatever distro you settle on; they're all buggy (and conda has features far beyond most) - that's the nature of software distributions. The environment is utterly chaotic.

NightMachinery commented 5 years ago

This has not been fixed in Python 3.7.3 (I had just installed miniconda3 on the day I commented). I’m happy with conda, but it remains that it breaks way too often to be suitable for beginners. Simpler tools with humbler ambitions will probably fare better for that use case.

rgommers commented 5 years ago

This has been fixed in python 3.7.4 already (for a long time). conda update python=3.7.4 should see you right. Let me know if that's not the case.

Nope, still broken in exactly the same way for python 3.7.4 h265db76_1, which is the most recent version as of right now.

Please let me know if you need more details or testing @mingwandroid.

I’m happy with conda, but it remains that it breaks way too often to be suitable for beginners. Simpler tools with humbler ambitions will probably fare better for that use case.

@NightMachinary the same can be said of pretty much any other tool. Just downloading the Anaconda distribution is still a very easy way to get started in Python land. That said, I would like to point out that comments like yours aren't exactly motivating for developers of an open source tool. Sometimes I'm frustrated too (with conda, as well as pip, linux, macOS, etc.), but I just take a deep breath and go for a walk, and then try to submit a constructive bug report or suggestion (or just let it go if I don't have time). I kindly suggest you to do the same.

NightMachinery commented 5 years ago

True, true. :)

JAngiolillo commented 5 years ago

I've been dealing with a gcc problem that I think is related to this thread, and it's not resolved after running:

conda install -c rdonnelly python=3.7.4

I'm using conda 4.7.12 and python 3.7.4.

My problem is that I have two identical .pyx files, based on the result of this command,

$ diff test.pyx test1.pyx

yet I can import test.pyx, but not test1.pyx, after running:

>>> import cython;import pyximport;pyximport.install()
(None, <pyximport.pyximport.PyxImporter object at 0x7fcacf6919d0>)

Both files are just:

import numpy
cimport numpy
cimport cython

Running this yields:

>>> import test1
...
/home/me/.pyxbld/temp.linux-x86_64-3.7/pyrex/test1.c:598:10: fatal error: numpy/arrayobject.h: No such file or directory
 #include "numpy/arrayobject.h"
          ^~~~~~~~~~~~~~~~~~~~~ 
...
ImportError: Building module test1 failed: ["distutils.errors.CompileError: command 'gcc' failed with exit status 1\n"]

If I remember right, I created both test.pyx and test1.pyx in vi, though it's possible I didn't create them within the same virtual environment. I don't understand the importance of this, but I've realized test.pyx doesn't have a hidden .c file in the location where the one that doesn't work has one.

me@my_machine:~$ ls .pyxbld/temp.linux-x86_64-3.7/pyrex/ -a
.  ..   test1.c  _test.c  tested.c

Is this related to the problem you've all been trying to fix?

rgommers commented 5 years ago

@JAngiolillo that's not related, and likely not even a bug. Just delete that generated test1.c file. If it still fails, try slimming it down (e.g. does it still fail if you delete test.pyx). And if it's a pyximport bug, you will need to ask on the Cython issue tracker.

JAngiolillo commented 5 years ago

Deleting the whole directory .pyxbld fails to fix it. Recreating the file doesn't either. Somehow that one file test.pyx works, whereas every subsequent .pyx fails.

(Also, I think I've realized that import test references a test module in the python3.7 directory rather than my local test.pyx file.)

Will move to the Cython issue tracker. Thanks.

ocehugo commented 5 years ago

I am just passing by to say that I encountered a recurrent problem with miniconda3.

Installing software with pip in miniconda3 uses the compiler_compat/ld binary. This binutils binary is incompatible with newer packages, such as binutils-2.32, elfutils-0.176 and gcc-8.3.0.

If your system is updated with the packages above and you try to pip install any package that uses GCC with debug information, you get something along the lines of:

gcc ... -DNDEBUG ...
 unable to initialize decompress status for section .debug_info

I solved it by forcing a symbolic link to my system ld in place of the defunct and outdated ld within the compiler_compat folder.
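
For reference, a sketch of that symlink workaround (the miniconda3 path is illustrative):

```bash
cd ~/miniconda3/compiler_compat
mv ld ld.bak                  # keep the bundled linker around, just in case
ln -s "$(command -v ld)" ld   # point at the system linker instead
```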

peschue commented 4 years ago

I can confirm that this issue can be solved by renaming the compiler_compat/ld before doing a pip install and then renaming it back. I ran into the issue while building jpype in a conda environment.

Thank you for providing a solution!

rgommers commented 4 years ago

Any updates on this? At this point I'd be happy with a simple env var that prevents installing compiler_compat/ld or deletes it at the end of any new env creation.

Another place I came across this: it's in the FAQ of pytorch-geometric.

alexkolo commented 4 years ago

I'm not sure if this is relevant or applicable here, but I'd just like to point out that I had a similar issue (see https://github.com/AtomDB/pyatomdb/issues/18), and I fixed it by installing Anaconda's most recent compiler before the package installation: conda install -c anaconda gcc_linux-64

The comment at https://github.com/pytorch/pytorch/issues/16683#issuecomment-459982988 explains why this fix worked, and I would naively think that it is probably also the safer fix in comparison to temporarily renaming conda's own ld.

rgommers commented 4 years ago

@alexkolo thanks for the suggestion. I know that's an option; it's not really relevant here since the whole point of compiler_compat/ld is to provide compatibility with system compilers.

jjhelmus commented 4 years ago

Any updates on this?

@mingwandroid and I (but mostly Ray) were exploring two possible options.

The first was making the vendored ld a shell script that would examine the version of gcc to determine which linker, the system or the one for compatibility, to call.

The second added similar logic to the _sysconfigdata file itself to strip out the -B PREFIX/compiler_compat options depending on the gcc version.

Neither of these was complete enough to include in the packages in defaults. If anyone wants to explore those more, they seem promising.

With the current packages it should be possible to specify a custom sysconfigdata file by pointing the _PYTHON_SYSCONFIGDATA_NAME variable at it.
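
For anyone who wants to try that, a hedged sketch; the module name shown is typical for a Python 3.7 linux-64 build but varies by platform, and the edit to drop the -B .../compiler_compat flags is left to you:

```bash
# Copy the env's sysconfigdata module somewhere on PYTHONPATH under a new name,
# edit it to remove the "-B .../compiler_compat" options, then point Python at it.
cp /opt/conda/envs/test/lib/python3.7/_sysconfigdata_m_linux_x86_64-linux-gnu.py ./_sysconfigdata_local.py
# (edit _sysconfigdata_local.py here)
export PYTHONPATH=$PWD
export _PYTHON_SYSCONFIGDATA_NAME=_sysconfigdata_local
python setup.py build_ext -i
```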

Alternatively, if the CC, CXX and LDSHARED variables are set, they should replace the defaults which come from the included sysconfigdata file, which contains the -B PREFIX... option and causes the use of compiler_compat/ld.

isuruf commented 4 years ago

3rd option is what conda-forge does. Ship the latest binutils version (2.33.1)

jjhelmus commented 4 years ago

3rd option is what conda-forge does. Ship the latest binutils version (2.33.1)

I really like this idea. I'm curious if it solves the issue. I have not been able to replicate the issue locally recently, so I can't test if updating binutils is sufficient. @rgommers or others, do you have a sample case that I can use to test?

jjhelmus commented 4 years ago

I've been able to reproduce the original issue and can confirm that the conda-forge Python package, which uses binutils 2.33.1, does address the issue.

I've been using the following commands as a test. This creates a new conda environment, builds NumPy 1.17.4, and then runs the test suite:

conda create -y -n test python=3.7 pip
conda activate test

wget --quiet https://github.com/numpy/numpy/releases/download/v1.17.4/numpy-1.17.4.tar.gz
tar xf numpy-1.17.4.tar.gz
cd numpy-1.17.4
pip install -vvv -e .

pip install pytest
python -c "import numpy; numpy.test()"

This fails with packages from defaults in an Arch Linux Docker container with gcc 9.2.0 and binutils 2.33.1:

...
gcc -pthread -B /opt/conda/envs/test/compiler_compat -Wl,--sysroot=/ _configtest.o -o _configtest
/opt/conda/envs/test/compiler_compat/ld: _configtest.o: unable to initialize decompress status for section .debug_info
/opt/conda/envs/test/compiler_compat/ld: _configtest.o: unable to initialize decompress status for section .debug_info
/opt/conda/envs/test/compiler_compat/ld: _configtest.o: unable to initialize decompress status for section .debug_info
/opt/conda/envs/test/compiler_compat/ld: _configtest.o: unable to initialize decompress status for section .debug_info
_configtest.o: file not recognized: file format not recognized
collect2: error: ld returned 1 exit status
...

If conda-forge's packages are used (conda create -n test python pip -c conda-forge), the build and tests pass.

Setting LDSHARED and CC to override the -B versions of these variables from Python's sysconfig file can be done using:

export LDSHARED="gcc -pthread -shared -L/opt/conda/envs/test/lib -Wl,-rpath=/opt/conda/envs/test/lib -Wl,--no-as-needed -Wl,--sysroot=/"
export CC="gcc"

With these variables set, the build completes and the tests pass using packages from defaults.

Arch Linux Dockerfile

```
FROM archlinux:20191105
ENV PATH /opt/conda/bin:$PATH
RUN pacman --noconfirm --needed -Syu \
    wget \
    tar \
    bzip2
RUN wget --quiet https://repo.anaconda.com/miniconda/Miniconda3-4.7.12-Linux-x86_64.sh -O ~/miniconda.sh && \
    /bin/bash ~/miniconda.sh -b -p /opt/conda && \
    rm ~/miniconda.sh && \
    ln -s /opt/conda/etc/profile.d/conda.sh /etc/profile.d/conda.sh && \
    echo ". /opt/conda/etc/profile.d/conda.sh" >> ~/.bashrc && \
    echo "conda activate base" >> ~/.bashrc
RUN pacman --noconfirm --needed -Syu \
    gcc
CMD [ "/bin/bash" ]
```

For another data point, the build and tests pass using packages from either channel if run in an Ubuntu 19.10 Docker container with gcc 9.2.1 and ld 2.33.

Ubuntu 19.10 Dockerfile

```Dockerfile
FROM ubuntu:19.10
ENV LANG=C.UTF-8 LC_ALL=C.UTF-8
ENV PATH /opt/conda/bin:$PATH
RUN apt-get update --fix-missing && \
    apt-get install -y wget bzip2 ca-certificates libglib2.0-0 libxext6 libsm6 libxrender1 git mercurial subversion
RUN wget --quiet https://repo.anaconda.com/miniconda/Miniconda3-4.7.12-Linux-x86_64.sh -O ~/miniconda.sh && \
    /bin/bash ~/miniconda.sh -b -p /opt/conda && \
    rm ~/miniconda.sh && \
    ln -s /opt/conda/etc/profile.d/conda.sh /etc/profile.d/conda.sh && \
    echo ". /opt/conda/etc/profile.d/conda.sh" >> ~/.bashrc && \
    echo "conda activate base" >> ~/.bashrc
RUN apt-get install -y gcc unzip
CMD [ "/bin/bash" ]
```

The error message itself seems similar to one described on the Gentoo wiki for upgrading to binutils 2.32, which suggests this may be a result of a bug in binutils that was fixed in 2.32 (the compiler_compat ld is 2.31.1), along with interactions with elfutils >= 0.175.
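
As a side note, one way to check whether an object file actually carries compressed debug sections is shown below; this is only an illustration, not part of any fix, and readelf marks SHF_COMPRESSED sections with a "C" in the flags column:

```bash
# Build a tiny object with debug info and inspect its .debug_info section flags.
gcc -g -c -o probe.o -x c - <<< 'int main(void) { return 0; }'
readelf -S probe.o | grep -A1 '\.debug_info'   # a "C" among the flags means the section is compressed
```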

I'm still puzzled as to why everything works on Ubuntu 19.10.

bdice commented 4 years ago

Similar issue here: my environment's ld wasn't as new as my system's ld, causing issues like those reported above:

/.../miniconda3/envs/my_env/compiler_compat/ld: build/temp.linux-x86_64-3.7/freud/box.o: unable to initialize decompress status for section .debug_info
build/temp.linux-x86_64-3.7/freud/box.o: file not recognized: file format not recognized
collect2: error: ld returned 1 exit status
error: command '/usr/bin/g++' failed with exit status 1

Environment compiler_compat/ld: GNU ld (crosstool-NG 1.23.0.444-4ea7) 2.31.1
System ld: GNU ld (GNU Binutils) 2.33.1

Renaming my environment's compiler_compat/ld to compiler_compat/ld_old fixed the problem and allowed me to build Cython packages again.

From a user perspective, I hope my process in attempting to resolve this problem can help to improve the corresponding docs. I saw the error came from .../compiler_compat/ld, so I realized that my system's ld wasn't being used. Then I read through quite a bit of documentation and did several internet searches to figure out if there was a way to update the compiler_compat. If my understanding is correct, the compiler_compat is set by the version of python in the environment? I didn't know that until finding this thread. If that's accurate, that would be great to mention in the compiler_compat/README file. I thought there would be a package called compiler_compat, or that it would be resolved by conda update --all.

Also, the compiler_compat/README file points to https://github.com/conda/conda/issues/6030 but I think this issue page was more helpful in identifying the problem and solutions.

jjhelmus commented 4 years ago

The latest releases of Python 3.x on the defaults channel, 3.6.10, 3.7.6 and 3.8.1, use a symlink to the executable from the ld_impl_ package for compiler_compat/ld, as conda-forge does. The ld_impl_ packages are built from the latest release of binutils.
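
One quick way to confirm an environment has picked up the new arrangement (paths are illustrative):

```bash
ls -l /opt/conda/envs/test/compiler_compat/ld                  # should now show a symlink rather than a vendored binary
/opt/conda/envs/test/compiler_compat/ld --version | head -n1   # reports the binutils version actually in use
```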

I've tested these Python packages with the Arch + NumPy example from above and there were no linker issues.

I think this is a reasonable fix for this issue, big thanks to @isuruf for working on it in conda-forge.

I'm going to close this issue. If there are still problems with compiler_compat/ld with these new python packages please comment in this issue and I will re-open the issue.

rgommers commented 4 years ago

Thanks @jjhelmus! Apologies for the delayed reply; I confirm it's working as expected now.

isuruf commented 4 years ago

@jjhelmus, I think I only fixed 3.7 and 3.8 on conda-forge. Can you send your changes for 3.6?

fernandocamargoai commented 4 years ago

I've just had this issue when creating an environment. It seems that the environment also has its own compiler_compat/ld. I had to rename it to let pip use my system's ld, which is newer.

Update: Putting "gcc_linux-64" in the list of dependencies of the environment.yml solves the problem.

link89 commented 1 year ago

A brief summary of potential solutions for this issue.

Quick Workaround

cd /path/to/your/conda/env/compiler_compat/ && mv ld ld.bak
# or just remove it if you don't care.

Recommended?

https://github.com/ContinuumIO/anaconda-issues/issues/11152#issuecomment-573120962

conda install -c conda-forge ld_impl_linux-64  # Modify the suffix to correspond with your platform

I have tested both solutions and they do solve my problem.