Robadob opened this issue 1 year ago
Via `LD_DEBUG=libs python3 -c "import pyflamegpu" 2> ld.log` I have tested that pyflamegpu loads its libs from conda if present. (Removal of rpath and static linkage from the `--verbose` link command was observed too.)
1054040: find library=libnvrtc.so.12 [0]; searching
1054040: search path=/home/rob/miniconda3/envs/py38/bin/../lib (RPATH from file python3)
1054040: trying file=/home/rob/miniconda3/envs/py38/bin/../lib/libnvrtc.so.12
1054040:
1054040: find library=libcuda.so.1 [0]; searching
1054040: search path=/home/rob/miniconda3/envs/py38/bin/../lib (RPATH from file python3)
1054040: trying file=/home/rob/miniconda3/envs/py38/bin/../lib/libcuda.so.1
1054040: search path=/usr/local/cuda-12.2/lib64:tls/haswell/x86_64:tls/haswell:tls/x86_64:tls:haswell/x86_64:haswell:x86_64: (LD_LIBRARY_PATH)
1054040: trying file=/usr/local/cuda-12.2/lib64/libcuda.so.1
1054040: trying file=tls/haswell/x86_64/libcuda.so.1
1054040: trying file=tls/haswell/libcuda.so.1
1054040: trying file=tls/x86_64/libcuda.so.1
1054040: trying file=tls/libcuda.so.1
1054040: trying file=haswell/x86_64/libcuda.so.1
1054040: trying file=haswell/libcuda.so.1
1054040: trying file=x86_64/libcuda.so.1
1054040: trying file=libcuda.so.1
1054040: search cache=/etc/ld.so.cache
1054040: trying file=/lib/x86_64-linux-gnu/libcuda.so.1
1054040:
1054040: find library=libcudart.so.12 [0]; searching
1054040: search path=/home/rob/miniconda3/envs/py38/bin/../lib (RPATH from file python3)
1054040: trying file=/home/rob/miniconda3/envs/py38/bin/../lib/libcudart.so.12
Note libcuda.so is always provided by the driver.
Conda post-build log of first successful package build. https://gist.github.com/Robadob/9b5f33c377a8bb45793058725769d8d5
Warning: rpath /home/rob/miniconda3/envs/py311/conda-bld/pyflamegpu_1698661724842/_build_env/lib is outside prefix /home/rob/miniconda3/envs/py311/conda-bld/pyflamegpu_1698661724842/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_p (removing it)
CMake isn't removing all RPATHs? Disabling the CMake rpath command did not change this warning, nor add anything related (assuming I updated git correctly).
We can possibly force an error for these warnings with `--error-overlinking`.
WARNING (pyflamegpu): interpreter (Python) package conda-forge::python-3.11.6-hab00c5b_0_cpython in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (pyflamegpu): interpreted library (Python) package conda-forge::astpretty-2.1.0-pyhd8ed1ab_0 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
No idea how to resolve these; we know them to be required, but for whatever reason conda thinks otherwise.
The suggested build/ignore_run_exports seems to be documented (1, 2) as doing something else: it prevents the run_exports of the named dependencies from being added as run requirements, which may not be what we want.
```yaml
build:
  ignore_run_exports:
    - python     # Not clear why conda thinks this isn't used
    - astpretty  # Not clear why conda thinks this isn't used
```
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: conda-forge/linux-64::cuda-nsight-12.0.78-ha770c72_0, conda-forge/linux-64::cuda-cuobjdump-12.0.76-h59595ed_0, conda-forge/linux-64::cuda-cuxxfilt-12.0.76-h59595ed_0, conda-forge/linux-64::cuda-nvprune-12.0.76-h59595ed_0, conda-forge/linux-64::cuda-sanitizer-api-12.0.90-h59595ed_0, conda-forge/linux-64::cuda-nvprof-12.0.90-h59595ed_0, conda-forge/linux-64::cuda-nvvp-12.0.90-h59595ed_0
path: 'LICENSE'
ClobberWarning: Conda was asked to clobber an existing path.
source path: /home/rob/miniconda3/envs/py311/pkgs/cuda-cuobjdump-12.0.76-h59595ed_0/LICENSE
target path: /home/rob/miniconda3/envs/py311/conda-bld/pyflamegpu_1698661724842/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/LICENSE
ClobberWarning: Conda was asked to clobber an existing path.
source path: /home/rob/miniconda3/envs/py311/pkgs/cuda-cuxxfilt-12.0.76-h59595ed_0/LICENSE
target path: /home/rob/miniconda3/envs/py311/conda-bld/pyflamegpu_1698661724842/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/LICENSE
ClobberWarning: Conda was asked to clobber an existing path.
source path: /home/rob/miniconda3/envs/py311/pkgs/cuda-nvprune-12.0.76-h59595ed_0/LICENSE
target path: /home/rob/miniconda3/envs/py311/conda-bld/pyflamegpu_1698661724842/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/LICENSE
ClobberWarning: Conda was asked to clobber an existing path.
source path: /home/rob/miniconda3/envs/py311/pkgs/cuda-sanitizer-api-12.0.90-h59595ed_0/LICENSE
target path: /home/rob/miniconda3/envs/py311/conda-bld/pyflamegpu_1698661724842/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/LICENSE
ClobberWarning: Conda was asked to clobber an existing path.
source path: /home/rob/miniconda3/envs/py311/pkgs/cuda-nvprof-12.0.90-h59595ed_0/LICENSE
target path: /home/rob/miniconda3/envs/py311/conda-bld/pyflamegpu_1698661724842/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/LICENSE
ClobberWarning: Conda was asked to clobber an existing path.
source path: /home/rob/miniconda3/envs/py311/pkgs/cuda-nvvp-12.0.90-h59595ed_0/LICENSE
target path: /home/rob/miniconda3/envs/py311/conda-bld/pyflamegpu_1698661724842/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/LICENSE
Not too sure what clobber is, some kind of path rewrite? It would suggest conda is trying to package the LICENSE files of various CUDA packages into our package, which seems wrong. Unable to find a resolution via googling; it currently seems harmless (clobbering can cause install failures in bad environments, but that doesn't appear to be the case here).
However, I'm concerned this will create an implicit requirement on the -dev variants of the packages rather than the non-dev versions (which contain the same .so).
The -dev packages might be able to go in build, with the non-dev versions in host / run?
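A minimal sketch of what that split might look like in the recipe; the package names are illustrative of the conda-forge CUDA 12 packages and this exact layout has not been tested:

```yaml
requirements:
  build:
    - cuda-nvcc              # build-time compiler
  host:
    - cuda-cudart-dev        # -dev variants: headers and unversioned .so symlinks for linking
    - cuda-nvrtc-dev
  run:
    - cuda-cudart            # non-dev variants: just the runtime .so files
    - cuda-nvrtc
```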
Not sure about the rest of the things to resolve; it might just take more trial and error.
The currently built Linux conda package appears to work; however, it's a bit awkward to install it with its dependencies:

- Create `test_channel/noarch` and place the package's `.tar.gz` inside it.
- `conda index test_channel` (requires conda-build).
- `conda install -c file:///home/rob/test_channel pyflamegpu` (the absolute path may vary; `~` does not work).

Windows log from the first successful conda packaging: https://gist.github.com/Robadob/9e3b7a94991762f76a22d864d727d89a
Missing DLLs: the two CUDA ones should probably be provided by the driver / CUDA toolkit, so they will be added to the missing DSO whitelist (see the sketch after these warnings). The VC++ ones appear similar to the Python warnings we had; if they are the VC++ redistributable they are required regardless of what conda thinks.
WARNING (pyflamegpu,Lib/site-packages/pyflamegpu/_pyflamegpu.pyd): $RPATH/nvrtc64_120_0.dll not found in packages, sysroot(s) nor the missing_dso_whitelist.
.. is this binary repackaging?
INFO (pyflamegpu,Lib/site-packages/pyflamegpu/_pyflamegpu.pyd): Needed DSO C:/Windows/System32/nvcuda.dll found in $SYSROOT
WARNING (pyflamegpu,Lib/site-packages/pyflamegpu/_pyflamegpu.pyd): $RPATH/cudart64_12.dll not found in packages, sysroot(s) nor the missing_dso_whitelist.
.. is this binary repackaging?
WARNING (pyflamegpu): dso library package conda-forge::vc14_runtime-14.36.32532-hdcecf7f_17 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (pyflamegpu): dso library package conda-forge::ucrt-10.0.22621.0-h57928b3_0 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
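A minimal sketch of the corresponding whitelist entries for the two CUDA DLL warnings above (nvcuda.dll was already found in $SYSROOT, so it is omitted); the glob form of each entry is an assumption rather than a tested pattern:

```yaml
build:
  missing_dso_whitelist:
    - '*/nvrtc64_120_0.dll'  # expected to come from the CUDA toolkit at run time
    - '*/cudart64_12.dll'    # expected to come from the CUDA runtime at run time
```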
The Windows vis build now seems to work.
Linux is stuck on CMake finding OpenGL; OpenGL is not a conda package, as it's more of a system library.
This guide suggests that the core dependency tree (CDT) package for mesa should be used, `- {{ cdt('mesa-libgl-devel') }}  # [linux]`. This does appear to compile; however, it is very old, so we would rather use libglvnd, as mesa may fail to run. This conda recipe includes its CDT `libglvnd-opengl`. However, when we attempt that (on Waimu) it tries to install the COS6 CDT version, which does not exist for any libglvnd package.
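For reference, a sketch of the two CDT variants being compared in the build requirements (only one would be kept; this is illustrative rather than the final recipe):

```yaml
requirements:
  build:
    # old mesa CDT: compiles, but is very old and may fail to run
    - {{ cdt('mesa-libgl-devel') }}   # [linux]
    # preferred libglvnd CDT: currently resolves to a COS6 CDT name that does not exist
    - {{ cdt('libglvnd-opengl') }}    # [linux]
```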
There isn't much documentation about the COS6/7 feature. conda-forge states that `- sysroot_linux-64 2.17  # [linux64]` will cause COS7 to be used (as also seen in the above example's recipe); however, that does not make a difference in our case. It is unclear whether this is a conda-forge-specific build feature.
More research and testing required.
Trying to get COS7 to work. The COS7 CDTs seem a lot like a conda-forge feature.

Adding `- sysroot_{{ target_platform }} 2.17  # [linux]` or `- sysroot_linux-64 2.17  # [linux64]` (per https://conda-forge.org/docs/maintainer/knowledge_base.html#using-centos-7) doesn't make a difference; maybe it is conda-forge specific.
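For context, this is roughly where the pin was tried, assuming the conda-forge convention of placing the sysroot pin next to the compilers in the build requirements:

```yaml
requirements:
  build:
    - {{ compiler('cxx') }}
    # request the glibc 2.17 (COS7) sysroot instead of the COS6 default
    - sysroot_linux-64 2.17   # [linux64]
```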
Using the `# [cdt_name=='cos7']` selector (https://github.com/conda-forge/cdt-builds/issues/41) doesn't make a difference either; maybe it is conda-forge specific.
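My reading of that issue is that the selector is driven by a `cdt_name` variant, e.g. via a conda_build_config.yaml next to the recipe; a hedged sketch, not confirmed to work outside conda-forge's own tooling:

```yaml
# conda_build_config.yaml (assumed)
cdt_name:       # which CentOS generation the cdt() jinja function pulls CDTs from
  - cos7        # [linux64]
```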
The newer stdlib approach:

```yaml
c_stdlib:
  - sysroot   # [linux]
c_stdlib_version:
  - 2.17      # [linux]
```

together with `- {{ stdlib('c') }}` in the requirements, doesn't work either; adding the package `{{ stdlib('c') }}` or `{{ stdlib('cxx') }}` causes an early RegEx failure when `conda build` is called.
Shall see how far I can work through this list for publishing on conda-forge this week.

Working on this here: https://github.com/conda-forge/staged-recipes/pull/24483 (very much a work in progress).
Changes

- `FLAMEGPU_BUILD_PYTHON_CONDA`
- `set(CMAKE_SKIP_RPATH TRUE)`
- `set(CMAKE_CUDA_RUNTIME_LIBRARY shared)` placement, so that executables and pyflamegpu link against shared CUDA libs

Todo

- `--error-overlinking` to the `conda build` command, to treat overlinking warnings as errors.