bryancatanzaro / copperhead

Data Parallel Python
Apache License 2.0
207 stars 28 forks

Installing without siteconf.py leads to error. #1

Closed adammathys closed 12 years ago

adammathys commented 12 years ago

Despite having boost_python installed in the default location, I am unable to compile any Copperhead code after installation. During installation, it seems to find the Boost libraries; I get:

Checking g++ version... (cached) 4.5.4
Checking nvcc version... (cached) 4.1

*************** siteconf.py not found ***************
We will try building anyway, but may not succeed.
Read the README for more details.

Checking for C++ library python2.7... (cached) yes
Checking for C++ library boost_python... (cached) yes
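The configure step presumably probes for each library by trying to compile and link a small test program. A rough analogue in plain Python (my sketch, not Copperhead's actual probe) is to ask the loader where it would find the library:

```python
from ctypes.util import find_library

# Rough analogue of a configure-time library probe (illustrative only,
# not Copperhead's actual check). find_library searches the standard
# linker paths and returns the library name, or None if nothing matches.
for lib in ("m", "boost_python"):
    found = find_library(lib)
    print("Checking for library %s... %s" % (lib, "yes" if found else "no"))
```

Note that such a probe only tells you the library is discoverable at configure time; it does not guarantee the later link commands actually pass `-lboost_python`, which is where the failure below occurs.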

Which leads me to believe that it knows where to look to find the boost_python libraries. However, if I try to run the axpy.py sample I get:

#define BOOST_PYTHON_MAX_ARITY 10
#include <boost/python.hpp>
#include <cunp.hpp>
#include <make_cuarray.hpp>
#include <make_cuarray_impl.hpp>
#include <make_sequence.hpp>
#include <make_sequence_impl.hpp>
#include <cuda.h>
namespace _axpyFnTupleFloat32SeqFloat32SeqFloat32SeqFloat32 {
sp_cuarray wrap_axpy(PyObject* _a, sp_cuarray ary_x, sp_cuarray ary_y);
}
using namespace _axpyFnTupleFloat32SeqFloat32SeqFloat32SeqFloat32;
template sequence<float> make_sequence<sequence<float> >(sp_cuarray&, bool, bool);
template sp_cuarray make_cuarray<float>(size_t);

BOOST_PYTHON_MODULE(module)
{
  boost::python::def("_axpy", &wrap_axpy);
}
#include <cunp.hpp>
#include <make_cuarray.hpp>
#include <make_sequence.hpp>
#include <cuda/prelude.h>

#include "cuda/thrust_wrappers/map.h"

namespace _axpyFnTupleFloat32SeqFloat32SeqFloat32SeqFloat32 {
template<typename a>
__device__ a _triad(a _xi, a _yi, a _K0) {
    typedef a T_xi;
    typedef a T_yi;
    typedef a T_K0;
    typedef a Te0;
    Te0 e0 = op_mul(_K0, _xi);
    typedef a Tresult;
    Tresult result = op_add(e0, _yi);
    return result;
}

template<typename a>
struct fn_triad {
    typedef a result_type;
    __device__ a operator()(a _xi, a _yi, a _K0) {
        typedef a T_xi;
        typedef a T_yi;
        typedef a T_K0;
        return _triad(_xi, _yi, _K0);
    }

};

sp_cuarray _axpy(float _a, stored_sequence<float> _x, stored_sequence<float> _y) {
    typedef float T_a;
    typedef stored_sequence<float> T_x;
    typedef stored_sequence<float> T_y;
    typedef transformed_sequence<closure1<float, fn_triad<float> >, thrust::tuple<T_x, T_y> > Tresult;
    Tresult result = map2(closure1<float, fn_triad<float > >(_a, fn_triad<float >()), _x, _y);
    typedef sp_cuarray Tarycompresult;
    Tarycompresult arycompresult = phase_boundary(result);
    typedef stored_sequence<float> Tcompresult;
    Tcompresult compresult = make_sequence<sequence<float> >(arycompresult, false, true);
    return arycompresult;
}

sp_cuarray wrap_axpy(PyObject* _a, sp_cuarray ary_x, sp_cuarray ary_y) {
    sp_cuarray result = _axpy(unpack_scalar_float(_a), make_sequence<sequence<float> >(ary_x, false, false), make_sequence<sequence<float> >(ary_y, false, false));
    return result;
}

}

ERROR during compilation in make_binary

ERROR during compilation in binarize

Traceback (most recent call last):
  File "axpy.py", line 57, in <module>
    (x, y, z, zPython, error) = test_saxpy(length)
  File "axpy.py", line 41, in test_saxpy
    z = axpy(a, x, y)
  File "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/runtime/cufunction.py", line 58, in __call__
    return P.execute(self, args, kwargs)
  File "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/runtime/driver.py", line 28, in execute
    return execute(cufn, *args, **kwargs)
  File "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/runtime/driver.py", line 71, in execute
    **k)
  File "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/compiler/passes.py", line 247, in compile
    return run_compilation(target, source, M)
  File "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/compiler/passes.py", line 232, in run_compilation
    return target(suite, M)
  File "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/compiler/passes.py", line 90, in __call__
    ast = P(ast, M)
  File "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/compiler/passes.py", line 90, in __call__
    ast = P(ast, M)
  File "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/compiler/passes.py", line 197, in make_binary
    return Binary.make_binary(M)
  File "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/compiler/binarygenerator.py", line 85, in make_binary
    raise e
ImportError: /tmp/codepy-compiler-cache-v5-uid1000/d0009f81104ade7d9a86f51a465339ba/codepy.temp.d0009f81104ade7d9a86f51a465339ba.module.so: undefined symbol: _ZNK5boost6python7objects21py_function_impl_base9max_arityEv
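The mangled name in that ImportError can be decoded with c++filt (part of GNU binutils); it demangles to a boost::python member function, which points at libboost_python not being linked into the module:

```shell
# Demangle the unresolved symbol from the ImportError above.
c++filt _ZNK5boost6python7objects21py_function_impl_base9max_arityEv
# -> boost::python::objects::py_function_impl_base::max_arity() const
```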

The compiler appears to have difficulty locating the correct Boost libraries. Installing with a siteconf.py containing:

#!/usr/bin/python
BOOST_INC_DIR = "/usr/include"
BOOST_LIB_DIR = "/usr/lib"
BOOST_PYTHON_LIBNAME = "boost_python"

solves the issue. For comparison, the generated siteconf.py looks like this:

NP_INC_PATH = "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/numpy/core/include"
BOOST_INC_DIR = None
BOOST_LIB_DIR = None
BOOST_PYTHON_LIBNAME = None

Perhaps having the generated siteconf.py fill in the default locations instead would work.
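For instance, a pre-filled siteconf.py might look like the following (a sketch; the Boost paths are typical Linux defaults matching the manual file above, and would need adjusting on other systems):

```python
# Hypothetical pre-filled siteconf.py: same fields as the generated one,
# but with common Linux defaults instead of None.
NP_INC_PATH = "/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/numpy/core/include"
BOOST_INC_DIR = "/usr/include"
BOOST_LIB_DIR = "/usr/lib"
BOOST_PYTHON_LIBNAME = "boost_python"
```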

adammathys commented 12 years ago

I've rerun the above with the verbose option. Here's the link command used by Codepy:

g++ -pthread -fno-strict-aliasing -g -O2 -g -fwrapv -O2 -Wall -fPIC -std=c++0x -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Xlinker -export-dynamic -Wl,-O1 -Wl,-Bsymbolic-functions -DNDEBUG -DCUDA_SUPPORT -I/usr/include/python2.7 -I/home/adamm/.virtualenvs/au/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/prelude -I/usr/local/cuda/include /tmp/codepy-compiler-cache-v5-uid1000/d0009f81104ade7d9a86f51a465339ba/module.o /tmp/codepy-compiler-cache-v5-uid1000/b2773585709ab7f476c3ad806407d1cd/gpu.o -L/usr/lib -L/usr/local/cuda/lib -L/usr/local/cuda/lib64 -lcuda -lcudart -lpthread -ldl -lutil -lcudart -o /tmp/codepy-compiler-cache-v5-uid1000/d0009f81104ade7d9a86f51a465339ba/codepy.temp.d0009f81104ade7d9a86f51a465339ba.module.so

For comparison, here are the commands with a build using siteconf.py:

g++ -pthread -fno-strict-aliasing -g -O2 -g -fwrapv -O2 -Wall -fPIC -std=c++0x -c -DNDEBUG -DCUDA_SUPPORT -I/usr/include/python2.7 -I/home/adamm/.virtualenvs/ag/local/lib/python2.7/site-packages/copperhead/prelude -I/usr/include -I/usr/local/cuda/include /tmp/codepy-compiler-cache-v5-uid1000/2bf9f4cd60d0262f8d7821f38cad2836/module.cpp -o /tmp/codepy-compiler-cache-v5-uid1000/2bf9f4cd60d0262f8d7821f38cad2836/module.o

nvcc -Xcompiler -fPIC -arch=sm_21 -c -DNDEBUG -DCUDA_SUPPORT -U__BLOCKS__ -I/usr/include/python2.7 -I/home/adamm/.virtualenvs/ag/local/lib/python2.7/site-packages/copperhead/prelude -I/usr/include -I/home/adamm/.virtualenvs/ag/local/lib/python2.7/site-packages/numpy/core/include -I/usr/local/cuda/include /tmp/codepy-compiler-cache-v5-uid1000/7256c103cf7b8d03406fcaefc8a1e14b/gpu.cu -o /tmp/codepy-compiler-cache-v5-uid1000/7256c103cf7b8d03406fcaefc8a1e14b/gpu.o

g++ -pthread -fno-strict-aliasing -g -O2 -g -fwrapv -O2 -Wall -fPIC -std=c++0x -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Xlinker -export-dynamic -Wl,-O1 -Wl,-Bsymbolic-functions -DNDEBUG -DCUDA_SUPPORT -I/usr/include/python2.7 -I/home/adamm/.virtualenvs/ag/local/lib/python2.7/site-packages/copperhead/prelude -I/usr/include -I/usr/local/cuda/include /tmp/codepy-compiler-cache-v5-uid1000/2bf9f4cd60d0262f8d7821f38cad2836/module.o /tmp/codepy-compiler-cache-v5-uid1000/7256c103cf7b8d03406fcaefc8a1e14b/gpu.o -L/usr/lib -L/usr/local/cuda/lib -L/usr/local/cuda/lib64 -lcuda -lcudart -lboost_python -lpthread -ldl -lutil -lcudart -o /tmp/codepy-compiler-cache-v5-uid1000/2bf9f4cd60d0262f8d7821f38cad2836/codepy.temp.2bf9f4cd60d0262f8d7821f38cad2836.module.so

From the looks of it, the only significant difference I can see is that the siteconf.py build includes -I/usr/include in its arguments, whereas the other build excludes it. Seeing this, I tried installing Copperhead with a siteconf.py containing:

#!/usr/bin/python
BOOST_INC_DIR = "/usr/include"
BOOST_LIB_DIR = None
BOOST_PYTHON_LIBNAME = None

No luck; I get the same error message as before, though with a slightly different compilation command:

g++ -pthread -fno-strict-aliasing -g -O2 -g -fwrapv -O2 -Wall -fPIC -std=c++0x -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Xlinker -export-dynamic -Wl,-O1 -Wl,-Bsymbolic-functions -DNDEBUG -DCUDA_SUPPORT -I/usr/include/python2.7 -I/home/adamm/.virtualenvs/zn/local/lib/python2.7/site-packages/copperhead-0.2a1-py2.7-linux-x86_64.egg/copperhead/prelude -I/usr/include -I/usr/local/cuda/include /tmp/codepy-compiler-cache-v5-uid1000/437bdfcaa8955d24032c5799e7996047/module.o /tmp/codepy-compiler-cache-v5-uid1000/bdfc2a7bee5fb4d771ee9acc4bf2a355/gpu.o -L/usr/lib -L/usr/local/cuda/lib -L/usr/local/cuda/lib64 -lcuda -lcudart -lpthread -ldl -lutil -lcudart -o /tmp/codepy-compiler-cache-v5-uid1000/437bdfcaa8955d24032c5799e7996047/codepy.temp.437bdfcaa8955d24032c5799e7996047.module.so

It now looks almost identical to the working command; the missing -I/usr/include has appeared. The link line still lacks -lboost_python, though, which would explain the undefined boost::python symbol.

bryancatanzaro commented 12 years ago

Thanks for the report. I've updated the build process significantly - I hope the new script catches the lack of configuration before building.
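A minimal sketch of such an early check (illustrative only, not the actual build script; the function name is hypothetical):

```python
import os
import sys

def require_siteconf(path="siteconf.py"):
    # Fail fast, before any compilation, if the configuration file is
    # missing. Hypothetical sketch of an early configuration check.
    if not os.path.exists(path):
        sys.stderr.write(
            "*** %s not found: run the configure step first "
            "(see README) ***\n" % path)
        return False
    return True
```

Failing here, rather than after compilation, would surface the missing configuration up front instead of as an opaque undefined-symbol error at import time.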

adammathys commented 12 years ago

Issue no longer present in most recent version of Copperhead.

bryancatanzaro commented 12 years ago

Thanks for the report!