With a freshly cloned thrust (and recursively cloned CUB) I'm getting 'file not found' errors in CUB.
It can be fixed by adding some subdirectories to the CUB include paths in cub/cub/cmake/cub-config.cmake:
target_include_directories(_CUB_CUB INTERFACE "${_CUB_INCLUDE_DIR}/cub/block")
target_include_directories(_CUB_CUB INTERFACE "${_CUB_INCLUDE_DIR}/cub/block/specializations")
but that doesn't really feel like the right way to solve the issue.
Interestingly, the issue doesn't manifest if I try to build just CUB in isolation.
MSVC 19.28.29331.0
Where did this version of MSVC come from?
I build regularly with MSVC Community 2019, and the updater doesn't report any version newer than 19.27.29112. Everything builds just fine there :-/
Is this a pre-release developer snapshot of MSVC or something?
This was 16.8 preview 4 I believe; I'm not at that machine currently though. The preview builds don't show up in the standard installer; I think there's a separate installer for them.
I'll install 16.7 and see if I can reproduce it
Same error with 19.27.29112.0 and CMake 3.18.0.
Have you added any CMake flags when you build with MSVC? I'm just doing
cmake ..
cmake --build .
from a build dir in the root of the thrust dir.
I just built the failing compilation units with a fresh, no-option* cmake .. on latest MSVC 2019.
*I did disable a few SM archs (all but sm75), but that shouldn't affect this.
It's strange. The compiler is finding the CUB headers, since the error is occurring in F:\code\thrust\cub\block\block_exchange.cuh, which fails at #include "../config.cuh". The only include path CUB needs is already added (-IF:\code\thrust\dependencies\cub), so I'm not sure why MSVC is failing to locate relative paths on your machine.
Some sanity checks: does F:\code\thrust\cub\config.cuh exist on your system?
It probably wouldn't be a bad idea for us to sweep through the CUB codebase and convert all of the relative includes to full includes relative to the CUB include path, but it should work as-is.
I think I see a possible explanation. Are you using the MSVC developer prompt to configure this, or some other shell?
Here's what's weird: both the Thrust and CUB include paths are added to the nvcc invocation:
-IF:\code\thrust -IF:\code\thrust\dependencies\cub
However, the file that MSVC is complaining about is F:\code\thrust\cub\block\block_exchange.cuh, not F:\code\thrust\dependencies\cub\cub\block\block_exchange.cuh. This is due to a cub UNIX-style symlink that currently exists in the root of the Thrust repo. It sounds like some part of your toolchain or environment is picking up the F:\code\thrust\cub symlink to pull in the cub\block\block_exchange.cuh header, but then the preprocessor is getting confused because it sees the F:\code\thrust\cub UNIX symlink as a file, and the path F:\code\thrust\cub\block\..\config.cuh simply doesn't make sense to native Windows tools.
I'm trying to get rid of that symlink because it's a timebomb in a cross-platform project, and it looks like you found a way to make it explode. Issue #1283 is tracking the effort to remove it in a way that won't break anyone's existing build.
As a workaround, try removing the F:\code\thrust\cub file/symlink and see if your build finishes.
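If you want to script that workaround rather than delete the file by hand, here is a minimal CMake sketch of the idea, assuming the root cub entry was checked out as a plain file (not a real directory) and with THRUST_REPO_ROOT as a placeholder for your checkout path:
set(THRUST_REPO_ROOT "F:/code/thrust")  # placeholder: adjust to your checkout
# If `cub` at the repo root is the checked-out UNIX symlink (a plain file on
# Windows) rather than a real directory, remove it so MSVC never tries to
# resolve CUB headers through it.
if(EXISTS "${THRUST_REPO_ROOT}/cub" AND NOT IS_DIRECTORY "${THRUST_REPO_ROOT}/cub")
  file(REMOVE "${THRUST_REPO_ROOT}/cub")
endif()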
cc: @brycelelbach for visibility -- symlinks are bad.
Updated title to improve google-ability of the underlying issue. Others may hit this before we get it fixed.
Yep, that's it! Deleting the symlink fixes the build.
What problem did the symlink solve? I imagine the CUB include dirs could be handled by the CUB CMake target include directories.
For reference, I tried building from the MSVC developer prompt and from VS Code (which I'm pretty sure uses the vcvars.bat from the dev prompt under the hood).
> What problem did the symlink solve? I imagine the CUB include dirs could be handled by the CUB CMake target include directories.
Some people wanted the simplicity of just doing -I/path/to/thrust/checkout/, but they only tested on Linux. Now that several years have passed, we worry that removing the symlink will break downstream builds. We're considering a monorepo for Thrust/CUB anyway, so we're going to punt on fixing the symlink until we decide on a path to the monorepo, which will eliminate the symlink problem.
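To make that concrete, here is a rough sketch of the difference (my_app and the paths are illustrative placeholders, not from this thread):
# With the repo-root `cub` symlink, one include root covers both projects:
#   -I/path/to/thrust/checkout
# Without the symlink, a consumer has to add both roots explicitly, e.g.:
target_include_directories(my_app PRIVATE
  "/path/to/thrust/checkout"                    # thrust/... headers
  "/path/to/thrust/checkout/dependencies/cub")  # cub/... headers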
> For reference, I tried building from the MSVC developer prompt and from VS Code (which I'm pretty sure uses the vcvars.bat from the dev prompt under the hood).
Ah, I thought maybe you were using some WSL or Git Bash setup that would understand the UNIX symlink and generate the bad path before it got to MSVC's preprocessor and choked.
Is this breaking / blocking your projects, or is deleting the symlink a reasonable workaround until we get the monorepo?
It's not really blocking any projects; I only tend to use the GitHub branch of Thrust to contribute to Thrust, not for real work. Deleting the symlink and disabling warnings-as-errors is a fine workaround for now.
FWIW, a monorepo shouldn't be necessary if the relative includes are removed and the CMake projects are set up properly. Obviously that could break people doing -I/path/to/thrust, but we don't want to be supporting such bad behaviour, do we :P
The monorepo is to address many issues, not just the symlink. Thrust and CUB are version locked, tightly coupled, and share a lot of build infrastructure. They're effectively two layers of abstraction providing the same functionality, so it makes sense to combine the code bases.
> FWIW, a monorepo shouldn't be necessary if the relative includes are removed and the CMake projects are set up properly.
I'm not sure what you mean here; the relative includes internal to each project should be fine, and we have a new CMake config that seems to be working fairly well for most users. Have you had other issues related to these?
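For reference, a minimal sketch of how that CMake package is consumed, as I understand it (ThrustCUDA, my_app, and main.cu are placeholder names; check Thrust's CMake documentation for the exact options):
cmake_minimum_required(VERSION 3.15)
project(thrust_consumer LANGUAGES CXX CUDA)

# Thrust_DIR should point at <thrust checkout>/thrust/cmake
find_package(Thrust REQUIRED CONFIG)
thrust_create_target(ThrustCUDA HOST CPP DEVICE CUDA)  # CUDA device backend

add_executable(my_app main.cu)
target_link_libraries(my_app ThrustCUDA)  # supplies the Thrust and CUB include paths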
I have the same issue when using Thrust with CMake on a GitHub Actions windows-2019 virtual machine.
Logs:
2021-07-23T12:20:44.0344123Z ##[group]Run $trimeshDir = "..\trimesh2-build"
2021-07-23T12:20:44.0344888Z $trimeshDir = "..\trimesh2-build"
2021-07-23T12:20:44.0345317Z cmake -A x64 `
2021-07-23T12:20:44.0345924Z -DCMAKE_TOOLCHAIN_FILE:FILEPATH="C:\vcpkg\scripts\buildsystems\vcpkg.cmake" `
2021-07-23T12:20:44.0346693Z -DTrimesh2_INCLUDE_DIR:PATH="$trimeshDir\include" `
2021-07-23T12:20:44.0347349Z -DTrimesh2_LINK_DIR:PATH="$trimeshDir\lib.Win64.vs142" `
2021-07-23T12:20:44.0347944Z -DCUDA_ARCH:STRING=$env:CUDA_ARCH `
2021-07-23T12:20:44.0348498Z -DCMAKE_BUILD_TYPE=Release `
2021-07-23T12:20:44.0349098Z -DThrust_DIR=D:\a\cuda_voxelizer\cuda_voxelizer\..\thrust-repo\thrust\cmake `
2021-07-23T12:20:44.0349646Z -S . -B .\build
2021-07-23T12:20:44.0392387Z shell: C:\Program Files\PowerShell\7\pwsh.EXE -command ". '{0}'"
2021-07-23T12:20:44.0392888Z env:
2021-07-23T12:20:44.0393266Z CUDA_MAJOR_VERSION: 11.3
2021-07-23T12:20:44.0393740Z CUDA_PATCH_VERSION: 1
2021-07-23T12:20:44.0394157Z TRIMESH_VERSION: 2020.03.04
2021-07-23T12:20:44.0394513Z CUDAARCHS: 60
2021-07-23T12:20:44.0394922Z NVIDIA_TRUST_VERSION: cuda-11.3
2021-07-23T12:20:44.0395478Z CUDA_PATH: C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3
2021-07-23T12:20:44.0396184Z CUDA_PATH_V11_3: C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3
2021-07-23T12:20:44.0396738Z CUDA_PATH_VX_Y: CUDA_PATH_V11_3
2021-07-23T12:20:44.0397107Z ##[endgroup]
2021-07-23T12:20:44.4055037Z -- Building for: Visual Studio 16 2019
2021-07-23T12:20:56.5812233Z -- The CXX compiler identification is MSVC 19.29.30038.1
2021-07-23T12:21:02.5776535Z -- The CUDA compiler identification is NVIDIA 11.3.109
2021-07-23T12:21:02.6732984Z -- Detecting CXX compiler ABI info
2021-07-23T12:21:05.0103549Z -- Detecting CXX compiler ABI info - done
2021-07-23T12:21:05.0127822Z -- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2019/Enterprise/VC/Tools/MSVC/14.29.30037/bin/Hostx64/x64/cl.exe - skipped
2021-07-23T12:21:05.0133657Z -- Detecting CXX compile features
2021-07-23T12:21:05.0160687Z -- Detecting CXX compile features - done
2021-07-23T12:21:05.0333761Z -- Detecting CUDA compiler ABI info
2021-07-23T12:21:08.0203734Z -- Detecting CUDA compiler ABI info - done
2021-07-23T12:21:08.0539020Z -- Check for working CUDA compiler: C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v11.3/bin/nvcc.exe - skipped
2021-07-23T12:21:08.0543309Z -- Detecting CUDA compile features
2021-07-23T12:21:08.0549805Z -- Detecting CUDA compile features - done
2021-07-23T12:21:11.0095399Z -- Found OpenMP_CXX: -openmp (found version "2.0")
2021-07-23T12:21:11.0100286Z -- Found OpenMP: TRUE (found version "2.0")
2021-07-23T12:21:11.0831989Z -- Found CUDAToolkit: C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v11.3/include (found version "11.3.109")
2021-07-23T12:21:15.4678404Z -- Found Trimesh2 include: D:/a/cuda_voxelizer/trimesh2-build/include/TriMesh.h
2021-07-23T12:21:15.4679974Z -- Found Trimesh2 lib: D:/a/cuda_voxelizer/trimesh2-build/lib.Win64.vs142/trimesh.lib
2021-07-23T12:21:15.4829470Z -- Found Thrust: D:/a/cuda_voxelizer/thrust-repo/thrust/cmake/thrust-config.cmake (found version "1.11.0.0")
2021-07-23T12:21:15.4888228Z -- Found CUB: D:/a/cuda_voxelizer/thrust-repo/dependencies/cub/cub/cmake/cub-config.cmake (found version "1.11.0.0")
2021-07-23T12:21:15.4911045Z -- Configuring done
2021-07-23T12:21:15.5498164Z -- Generating done
2021-07-23T12:21:15.5505602Z CMake Warning:
2021-07-23T12:21:15.5506426Z Manually-specified variables were not used by the project:
2021-07-23T12:21:15.5507059Z
2021-07-23T12:21:15.5507554Z CUDA_ARCH
2021-07-23T12:21:15.5507929Z
2021-07-23T12:21:15.5508271Z
2021-07-23T12:21:15.5516155Z -- Build files have been written to: D:/a/cuda_voxelizer/cuda_voxelizer/build
2021-07-23T12:21:15.7235016Z ##[group]Run cmake --build .\build --parallel 2 --target ALL_BUILD --config Release
2021-07-23T12:21:15.7235916Z cmake --build .\build --parallel 2 --target ALL_BUILD --config Release
2021-07-23T12:21:15.7276617Z shell: C:\Program Files\PowerShell\7\pwsh.EXE -command ". '{0}'"
2021-07-23T12:21:15.7277144Z env:
2021-07-23T12:21:15.7277514Z CUDA_MAJOR_VERSION: 11.3
2021-07-23T12:21:15.7277954Z CUDA_PATCH_VERSION: 1
2021-07-23T12:21:15.7278332Z TRIMESH_VERSION: 2020.03.04
2021-07-23T12:21:15.7278728Z CUDAARCHS: 60
2021-07-23T12:21:15.7279157Z NVIDIA_TRUST_VERSION: cuda-11.3
2021-07-23T12:21:15.7279712Z CUDA_PATH: C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3
2021-07-23T12:21:15.7280408Z CUDA_PATH_V11_3: C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3
2021-07-23T12:21:15.7280974Z CUDA_PATH_VX_Y: CUDA_PATH_V11_3
2021-07-23T12:21:15.7281362Z ##[endgroup]
2021-07-23T12:21:16.1803944Z Microsoft (R) Build Engine version 16.10.2+857e5a733 for .NET Framework
2021-07-23T12:21:16.1805376Z Copyright (C) Microsoft Corporation. All rights reserved.
2021-07-23T12:21:16.1805954Z
2021-07-23T12:21:16.6176867Z Checking Build System
2021-07-23T12:21:16.7999109Z Building Custom Rule D:/a/cuda_voxelizer/cuda_voxelizer/CMakeLists.txt
2021-07-23T12:21:17.6273841Z Compiling CUDA source file ..\src\voxelize.cu...
2021-07-23T12:21:18.7586958Z
2021-07-23T12:21:18.7608519Z D:\a\cuda_voxelizer\cuda_voxelizer\build>"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\bin\nvcc.exe" -gencode=arch=compute_60,code=\"compute_60,compute_60\" -gencode=arch=compute_60,code=\"sm_60,compute_60\" --use-local-env -ccbin "C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\VC\Tools\MSVC\14.29.30037\bin\HostX64\x64" -x cu -I"D:\a\cuda_voxelizer\cuda_voxelizer\..\trimesh2-build\include" -I"D:\a\cuda_voxelizer\thrust-repo" -I"D:\a\cuda_voxelizer\thrust-repo\dependencies\cub" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\include" -I"C:\vcpkg\installed\x64-windows\include" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\include" --keep-dir x64\Release -maxrregcount=0 --machine 64 --compile -cudart static -std=c++17 -Xcompiler="/EHsc -Ob2" -D_WINDOWS -DNDEBUG -DTHRUST_HOST_SYSTEM=THRUST_HOST_SYSTEM_CPP -DTHRUST_DEVICE_SYSTEM=THRUST_DEVICE_SYSTEM_CUDA -D"CMAKE_INTDIR=\"Release\"" -D_MBCS -DWIN32 -D_WINDOWS -DNDEBUG -DTHRUST_HOST_SYSTEM=THRUST_HOST_SYSTEM_CPP -DTHRUST_DEVICE_SYSTEM=THRUST_DEVICE_SYSTEM_CUDA -D"CMAKE_INTDIR=\"Release\"" -Xcompiler "/EHsc /W1 /nologo /O2 /Fdcuda_voxelizer.dir\Release\vc142.pdb /FS /MD " -o cuda_voxelizer.dir\Release\voxelize.obj "D:\a\cuda_voxelizer\cuda_voxelizer\src\voxelize.cu"
2021-07-23T12:21:18.7613542Z Compiling CUDA source file ..\src\thrust_operations.cu...
2021-07-23T12:21:18.8483861Z
2021-07-23T12:21:18.8501921Z D:\a\cuda_voxelizer\cuda_voxelizer\build>"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\bin\nvcc.exe" -gencode=arch=compute_60,code=\"compute_60,compute_60\" -gencode=arch=compute_60,code=\"sm_60,compute_60\" --use-local-env -ccbin "C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\VC\Tools\MSVC\14.29.30037\bin\HostX64\x64" -x cu -I"D:\a\cuda_voxelizer\cuda_voxelizer\..\trimesh2-build\include" -I"D:\a\cuda_voxelizer\thrust-repo" -I"D:\a\cuda_voxelizer\thrust-repo\dependencies\cub" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\include" -I"C:\vcpkg\installed\x64-windows\include" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.3\include" --keep-dir x64\Release -maxrregcount=0 --machine 64 --compile -cudart static -std=c++17 -Xcompiler="/EHsc -Ob2" -D_WINDOWS -DNDEBUG -DTHRUST_HOST_SYSTEM=THRUST_HOST_SYSTEM_CPP -DTHRUST_DEVICE_SYSTEM=THRUST_DEVICE_SYSTEM_CUDA -D"CMAKE_INTDIR=\"Release\"" -D_MBCS -DWIN32 -D_WINDOWS -DNDEBUG -DTHRUST_HOST_SYSTEM=THRUST_HOST_SYSTEM_CPP -DTHRUST_DEVICE_SYSTEM=THRUST_DEVICE_SYSTEM_CUDA -D"CMAKE_INTDIR=\"Release\"" -Xcompiler "/EHsc /W1 /nologo /O2 /Fdcuda_voxelizer.dir\Release\vc142.pdb /FS /MD " -o cuda_voxelizer.dir\Release\thrust_operations.obj "D:\a\cuda_voxelizer\cuda_voxelizer\src\thrust_operations.cu"
2021-07-23T12:21:18.8531531Z D:\a\cuda_voxelizer\thrust-repo\cub\block\block_exchange.cuh(36): fatal error C1083: Cannot open include file: '../config.cuh': No such file or directory [D:\a\cuda_voxelizer\cuda_voxelizer\build\cuda_voxelizer.vcxproj]
2021-07-23T12:21:18.8564925Z thrust_operations.cu
This will no longer be an issue after Thrust 2.0 🎉