h2oai / h2o4gpu

H2Oai GPU Edition
Apache License 2.0

Recipe for conda build #667 #671

Closed hemenkapadia closed 6 years ago

hemenkapadia commented 6 years ago

Added functionality to generate conda packages in addition to the wheel files as part of the Jenkins build process. This does not yet auto-deploy the conda packages to the h2oai conda channel; however, that can easily be implemented by providing a promote option on our internal conda server.
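For context, a minimal sketch of what the packaging step could look like. The recipe directory, output folder, and promote/upload helper are illustrative placeholders, not the actual Jenkins configuration:

    # Minimal sketch of the conda packaging step. RECIPE_DIR, OUTPUT_DIR and the
    # promote step are placeholders, not the real Jenkins settings.
    import subprocess

    RECIPE_DIR = "conda/recipe"      # hypothetical location of meta.yaml
    OUTPUT_DIR = "dist/conda"        # where the built .tar.bz2 packages land

    def build_conda_package():
        # conda build renders meta.yaml, builds the package and runs its tests
        subprocess.run(
            ["conda", "build", RECIPE_DIR, "--output-folder", OUTPUT_DIR],
            check=True,
        )

    def promote_to_channel(package_path, channel="h2oai"):
        # Stand-in for the internal "promote" option mentioned above; a public
        # equivalent would be an `anaconda upload` to the h2oai channel.
        subprocess.run(
            ["anaconda", "upload", "--user", channel, package_path],
            check=True,
        )

    if __name__ == "__main__":
        build_conda_package()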

Issue #667

hemenkapadia commented 6 years ago

@mdymczyk and @pseudotensor,

The x86_64 build via Jenkins for all CUDA versions (8, 9, 9.2) is working for this PR.

For ppc64le we have two issues:

  1. Multiple dependencies for ppc64le are not available in the conda environment. It will take considerable time to get all of the required packages available.
  2. To get the x86_64 packages working I had to update some dependencies in requirements_build.txt, namely llvmlite from 0.20.0 to 0.21.0. However, that version does not seem to be compatible with the LLVM version that is downloaded from artifacts.h2o.ai as part of the Docker build step, so pip install of requirements_build.txt fails with the error below (a version-check sketch follows the log).
    /opt/h2oai/h2o4gpu/python/bin/python /tmp/pip-build-pjuilgvd/llvmlite/ffi/build.py
    17:49:51   LLVM version... 4.0.1
    17:49:51   
    17:49:51   Traceback (most recent call last):
    17:49:51     File "/tmp/pip-build-pjuilgvd/llvmlite/ffi/build.py", line 141, in <module>
    17:49:51       main()
    17:49:51     File "/tmp/pip-build-pjuilgvd/llvmlite/ffi/build.py", line 131, in main
    17:49:51       main_posix('linux', '.so')
    17:49:51     File "/tmp/pip-build-pjuilgvd/llvmlite/ffi/build.py", line 105, in main_posix
    17:49:51       raise RuntimeError(msg)
    17:49:51   RuntimeError: Building llvmlite requires LLVM 5.0.x. Be sure to set LLVM_CONFIG to the right executable path.
    17:49:51   Read the documentation at http://llvmlite.pydata.org/ for more information about building llvmlite.
    17:49:51   
    17:49:51   error: command '/opt/h2oai/h2o4gpu/python/bin/python' failed with exit status 1
    17:49:51   
    17:49:51   ----------------------------------------
    17:49:51   Failed building wheel for llvmlite
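The mismatch could be caught before pip runs. A small pre-flight check, sketched below under the assumption that LLVM_CONFIG is the same variable llvmlite's build script reads and that only the two pins discussed here need to be covered:

    # Sketch: verify the LLVM on the build image matches the pinned llvmlite
    # before `pip install -r requirements_build.txt` is attempted.
    import os
    import subprocess

    # llvmlite series -> LLVM series it builds against
    # (0.20.x -> LLVM 4.0.x, 0.21.x -> LLVM 5.0.x, per the error above)
    LLVM_FOR_LLVMLITE = {"0.20": "4.0", "0.21": "5.0"}

    def llvm_version():
        # llvmlite's build honours LLVM_CONFIG, so read the same variable
        exe = os.environ.get("LLVM_CONFIG", "llvm-config")
        out = subprocess.run([exe, "--version"], capture_output=True,
                             text=True, check=True)
        return out.stdout.strip()        # e.g. "4.0.1"

    def check(llvmlite_pin):
        series = ".".join(llvmlite_pin.split(".")[:2])
        wanted = LLVM_FOR_LLVMLITE[series]
        found = llvm_version()
        if not found.startswith(wanted):
            raise RuntimeError(
                f"llvmlite {llvmlite_pin} needs LLVM {wanted}.x but found "
                f"{found}; point LLVM_CONFIG at a matching llvm-config")

    if __name__ == "__main__":
        check("0.21.0")   # fails against the 4.0.1 toolchain from artifacts.h2o.ai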

What are your thoughts on approaching this? Can we merge this PR for the x86_64 build?

mdymczyk commented 6 years ago

@hemenkapadia please don't merge anything that fails (either ppc64le or x86_64).

If it's too much work and we don't need it immediately (or at all), then I'd say we should just skip it during the ppc64le build and only build it for x86_64.
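A minimal sketch of that guard, assuming the packaging step is driven from Python and that make_conda_package() is a hypothetical entry point for it:

    # Sketch: only run the conda packaging step on x86_64, skip it on ppc64le.
    import platform

    def make_conda_package():
        ...                              # hypothetical packaging entry point

    def maybe_build_conda_package():
        arch = platform.machine()        # "x86_64" or "ppc64le" on these builders
        if arch != "x86_64":
            print(f"Skipping conda package build on {arch}")
            return
        make_conda_package()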

"However that version does not seem to be compatible with the LLVM version that is downloaded from artifacts.h2o.ai as part of the docker build step"

Can we try removing that LLVM stuff from the Docker image and see if it still works?

pseudotensor commented 6 years ago

Ya, definitely don't merge if not green. But as we discussed, I don't see why LLVM is needed, so remove it unless I'm confused.

hemenkapadia commented 6 years ago

All checks are passing now. We discussed on PR #678 that LLVM is needed for llvmlite, which is a dependency of numba. Since all checks are passing, please approve this PR if you are OK with it so that I can merge to dev.
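As a side note, a quick sanity check (a sketch, not part of this PR) that the built environment has the numba -> llvmlite -> LLVM chain wired up:

    # Import both packages and exercise the JIT path, which goes through llvmlite.
    import llvmlite
    import numba
    from numba import njit

    print("numba", numba.__version__, "using llvmlite", llvmlite.__version__)

    @njit
    def add(a, b):
        return a + b

    assert add(2, 3) == 5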