Open 0x0f0f0f opened 3 years ago
It seems like building AMD's LLVM fork is necessary to build the ROCm Runtime.
Related: #19507 #21153
I'd love to have this packaged, but it looks complex af 😢
I found this repository to build ROCm: https://github.com/xuhuisheng/rocm-build. I forked it here and made some changes to make it work on Void: https://github.com/Nicop06/rocm-build. The only thing missing is figuring out the dependencies, which I manually installed.
Here is my failed attempt at packaging this AMD garbage, just as a reference. Based on the Arch work.
If anyone is still interested in this, I also tried to package ROCm/HIP for Void here. I've only tested/built it for x86_64 glibc. I've also included package options for each GPU architecture, which you should set via XBPS_PKG_OPTIONS in the etc/conf file if you don't want to build for all of them. There is also a HIP-sdk meta package which installs everything.
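As a sketch of what that looks like (hypothetical choice: building only for gfx1030/RDNA2, with every other target left disabled), the line in etc/conf would be:

```shell
# etc/conf — build ROCm packages only for gfx1030; a leading '~' disables an option
XBPS_PKG_OPTIONS=~gfx803,~gfx900,~gfx906,~gfx908,~gfx90a,~gfx1010,~gfx1011,~gfx1012,gfx1030,~gfx1031,~gfx1100,~gfx1101,~gfx1102
```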
@rederick29 any plans to support rocblas?
I already have a rocBLAS package in my repo. I've tested it and it works just fine for me. If you have any issues with it or have any questions please open an issue on my fork.
Well, for those who, like me, want to build it themselves, I'll leave a hint just in case. Be warned that this takes a lot of time (about a day) and requires a lot of RAM + swap (up to 60 GB).
sudo xbps-install -Su
git clone https://github.com/rederick29/void-packages rocm
cd rocm
git checkout rocm
git remote add upstream https://github.com/void-linux/void-packages
git fetch upstream master
git rebase upstream/master
./xbps-src binary-bootstrap
# REMOVE the '~' (tilde) from the GPU architecture(s) you wish to build for
echo "XBPS_PKG_OPTIONS=~gfx803,~gfx900,~gfx906,~gfx908,~gfx90a,~gfx1010,~gfx1011,~gfx1012,~gfx1030,~gfx1031,~gfx1100,~gfx1101,~gfx1102" >> etc/conf
./xbps-src pkg HIP-sdk
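Once the build finishes, the resulting binary packages should end up under hostdir/binpkgs in the checkout (xbps-src's standard output location); a quick sanity check, sketched here:

```shell
# List the built packages (hostdir/binpkgs is xbps-src's default output dir)
ls hostdir/binpkgs/*.xbps
```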
UPD: Faster way:
git clone https://github.com/rederick29/void-packages rocm --depth 1 -b rocm
git clone --depth 1 https://github.com/void-linux/void-packages
cp -r rocm/srcpkgs/{FunctionalPlus,HIP-sdk,HIPIFY,MIOpen,MIOpenGEMM,ROCR-Runtime,ROCT-Thunk-Interface,ROCclr,ROCdbgapi,ROCgdb,ROCm-CompilerSupport,ROCm-Device-Libs,ROCm-OpenCL-Runtime,ROCm-composable_kernel,ROCm-core,frugally-deep,half,hipBLAS,hipCUB,hipFFT,hipSOLVER,hipSPARSE,hipamd,hsa-amd-aqlprofile-bin,magma,python3-CppHeaderParser,python3-barectf,rccl,rocALUTION,rocBLAS,rocFFT,rocMLIR,rocPRIM,rocRAND,rocSOLVER,rocSPARSE,rocThrust,rocm-cmake,rocm-flang-pgmath,rocm-flang,rocm-llvm-openmp,rocm-llvm,rocm_smi_lib,rocminfo/,rocprofiler,rocprofiler-v2,roctracer} ./void-packages/srcpkgs/
# Go to https://github.com/void-linux/void-packages/compare/master...rederick29:void-packages:rocm
# Copy additions from common/shlibs and paste at end of ./void-packages/common/shlibs
cd void-packages
./xbps-src binary-bootstrap
./xbps-src pkg rocminfo
sudo xi rocminfo
# Get GPU architecture name:
/opt/rocm/bin/rocminfo
# REMOVE the '~' (tilde) from the GPU architecture(s) you wish to build for:
echo "XBPS_PKG_OPTIONS=~gfx803,~gfx900,~gfx906,~gfx908,~gfx90a,~gfx1010,~gfx1011,~gfx1012,~gfx1030,~gfx1031,~gfx1100,~gfx1101,~gfx1102" >> etc/conf
# Build what you want. For all packages:
./xbps-src pkg HIP-sdk
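Sidenote: rocminfo prints a lot; if you only want the gfx target string, a pipeline like this should do it (assuming GNU grep and the /opt/rocm/bin/rocminfo path used above):

```shell
# Print only the first gfx target (e.g. "gfx906") from rocminfo's output
/opt/rocm/bin/rocminfo | grep -m1 -oE 'gfx[0-9a-f]+'
```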
UPD2: ROCm 6.0:
git clone https://github.com/rederick29/void-packages rocm --depth 1 -b rocm6
git clone --depth 1 https://github.com/void-linux/void-packages
cp -r rocm/srcpkgs/{FunctionalPlus,HIP-sdk,HIPIFY,hipRAND,MIOpen,MIOpenGEMM,ROCR-Runtime,ROCT-Thunk-Interface,ROCclr,ROCdbgapi,ROCgdb,ROCm-CompilerSupport,ROCm-Device-Libs,ROCm-OpenCL-Runtime,ROCm-composable_kernel,ROCm-core,frugally-deep,rocm-half,hipBLAS,hipCUB,hipFFT,hipSOLVER,hipSPARSE,hipamd,hsa-amd-aqlprofile-bin,magma,python3-CppHeaderParser,python3-barectf,rccl,rocALUTION,rocBLAS,rocFFT,rocMLIR,rocPRIM,rocRAND,rocSOLVER,rocSPARSE,rocThrust,rocm-cmake,rocm-flang-pgmath,rocm-flang,rocm-llvm-openmp,rocm-llvm,rocm_smi_lib,rocminfo/,rocprofiler,roctracer} ./void-packages/srcpkgs/
# Go to https://github.com/void-linux/void-packages/compare/master...rederick29:void-packages:rocm
# Copy additions from common/shlibs and paste at end of ./void-packages/common/shlibs
cd void-packages
./xbps-src binary-bootstrap
./xbps-src pkg rocminfo
sudo xi rocminfo
# Get GPU architecture name:
/opt/rocm/bin/rocminfo
# REMOVE the '~' (tilde) from the GPU architecture(s) you wish to build for:
echo "XBPS_PKG_OPTIONS=~gfx803,~gfx900,~gfx906,~gfx908,~gfx90a,~gfx940,~gfx941,~gfx942,~gfx1010,~gfx1011,~gfx1012,~gfx1030,~gfx1031,~gfx1100,~gfx1101,~gfx1102" >> etc/conf
# Build what you want. For most packages:
./xbps-src pkg HIP-sdk
sudo xi HIP-sdk
# The packages "hipBLASLt" and "rccl" are not built or installed by this meta package. They are not commonly used as part of HIP, so install them separately if you need them.
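If you do need them, the same xbps-src workflow from above should apply (untested sketch, using rccl as the example; hipBLASLt would follow the same pattern):

```shell
# Build rccl on its own, then install the resulting binary package
./xbps-src pkg rccl
sudo xi rccl
```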
I tried building this and it was going great until my CPU died... guess it was too much. I really hope we can get this packaged; I really don't want to dual-boot.
Depending on what you're doing, you could use Docker.
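For reference, something along these lines is the usual way to expose an AMD GPU to a container (device flags per AMD's Docker guidance; rocm/rocm-terminal is one of AMD's published images, so adjust the image name to taste):

```shell
# Pass the kernel GPU interfaces (/dev/kfd, /dev/dri) through to the container
docker run -it \
  --device=/dev/kfd --device=/dev/dri \
  --security-opt seccomp=unconfined \
  --group-add video \
  rocm/rocm-terminal
```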
@itsdeadguy I think I can provide you with binary packages for gfx906
https://rocmdocs.amd.com/en/latest/Installation_Guide/List-of-ROCm-Packages-for-Ubuntu-Fedora.html
https://github.com/RadeonOpenCompute/ROCm
Trying to build julia support for AMD GPUs on ROCm :)