lion03 / thrust

Automatically exported from code.google.com/p/thrust

Cuda 4.0 compile error: kernel launches from templates are not allowed in system files #359

Closed: GoogleCodeExporter closed this issue 8 years ago

GoogleCodeExporter commented 8 years ago
Please post a short, self-contained code sample which reproduces the
problem:
~/NVIDIA_GPU_Computing_SDK/C$ make

What is the expected output? What do you see instead?
After successfully installing Cuda 4.0 and the 4.0 SDK on Ubuntu 11.04, and 
then running make for the SDK as above, I get the following output:

make[1]: Entering directory `/home/ely/NVIDIA_GPU_Computing_SDK/C/src/radixSortThrust'
/usr/local/cuda/include/thrust/detail/device/cuda/detail/launch_closure.inl(84): error: kernel launches from templates are not allowed in system files

/usr/local/cuda/include/thrust/detail/device/cuda/detail/launch_closure.inl(118): error: kernel launches from templates are not allowed in system files

/usr/local/cuda/include/thrust/detail/device/cuda/detail/fast_scan.inl(410): error: kernel launches from templates are not allowed in system files

/usr/local/cuda/include/thrust/detail/device/cuda/detail/fast_scan.inl(420): error: kernel launches from templates are not allowed in system files

... etc (there are about 30 or so of these error lines printed).
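
(A short, self-contained sample along the following lines should hit the same code path as the radixSortThrust example; the file name min_repro.cu and its contents are illustrative, not taken from the original report.)

// min_repro.cu -- illustrative, hypothetical file name. Any Thrust algorithm
// that launches kernels internally should trigger the same diagnostics when
// /usr/local/cuda/include is treated as a system directory.
// Build (assuming nvcc is on PATH):  nvcc min_repro.cu -o min_repro
#include <thrust/device_vector.h>
#include <thrust/sort.h>

int main(void)
{
    // Fill a small device vector and sort it; thrust::sort instantiates
    // templated kernel launches inside the Thrust headers.
    thrust::device_vector<int> vec(4);
    vec[0] = 3; vec[1] = 1; vec[2] = 4; vec[3] = 1;
    thrust::sort(vec.begin(), vec.end());
    return 0;
}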

What version of Thrust are you using? Which version of nvcc?  Which host
compiler?  On what operating system?  Which GPU?

I have version 1.4.0 of Thrust, version 4.0 of NVCC, GCC/G++ 4.4.5, Ubuntu 11.04, and an Nvidia Quadro 5000 GPU.

Please provide any additional information below.

Original issue reported on code.google.com by spear...@gmail.com on 22 Aug 2011 at 9:13

GoogleCodeExporter commented 8 years ago
I don't believe this is a Thrust bug; for some reason nvcc is treating the path 
/usr/local/cuda/include as a "system" location, and kernel launches from 
templates are apparently not allowed in system files.
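
(For reference, the construct the diagnostic refers to is a <<<...>>> launch inside a function template, which Thrust's headers rely on internally. A minimal sketch of such a construct, not Thrust's actual code:)

// Illustrative sketch only -- not Thrust's actual implementation.
template <typename T>
__global__ void fill_kernel(T *data, T value)
{
    data[threadIdx.x] = value;
}

template <typename T>
void fill_on_device(T *data, T value, int n)
{
    // This is a "kernel launch from a template": if the header containing it
    // were treated as a system file, nvcc 4.0 would reject it with the error above.
    fill_kernel<T><<<1, n>>>(data, value);
}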

Do you have any idea why nvcc might consider /usr/local/cuda/include a system 
location on your system? What is the output of 'env'?

Original comment by jaredhoberock on 23 Aug 2011 at 8:50

GoogleCodeExporter commented 8 years ago
I've followed up with the compiler team, and received this reply:

This could be a system preprocessor or cudafe bug. Cudafe marks a file as a 
"system" file, based on the flag values for the "#line" directive in the 
generated pre-processor output [1].  So it may be that either the preprocessor 
is wrongly marking certain files as "system" files or perhaps cudafe is 
misinterpreting the preprocessor output.
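
(For context: linemarkers in preprocessed output have the form '# linenum "filename" flags', and per the GCC documentation referenced at [1], a flag value of 3 marks the text as coming from a system header. An illustrative example, matching the first error above:)

# 84 "/usr/local/cuda/include/thrust/detail/device/cuda/detail/launch_closure.inl" 3

If that trailing 3 shows up for the Thrust headers, cudafe would mark them as "system" files, per the explanation above.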

If you can repro, could you attach all the intermediate files generated by the 
"-v -keep" options, as well as the verbose nvcc output? I'll take a look. Also 
useful to know: is this intermittent, or does it always occur?

If possible, could you add -v -keep to your nvcc command line and respond with 
the result?

[1] http://gcc.gnu.org/onlinedocs/cpp/Preprocessor-Output.html

Original comment by jaredhoberock on 23 Aug 2011 at 11:33

GoogleCodeExporter commented 8 years ago
Here is the output of env:

ely@zaffpants:~/NVIDIA_GPU_Computing_SDK$ env
CPLUS_INCLUDE_PATH=/usr/local/cuda/include
ORBIT_SOCKETDIR=/tmp/orbit-ely
SSH_AGENT_PID=3507
SHELL=/bin/bash
TERM=xterm
XDG_SESSION_COOKIE=f552f4a92ccf8e4f636987a74ca9f001-1314044268.739691-232134353
LIBRARY_PATH=/usr/lib/nvidia-current
WINDOWID=65011742
GNOME_KEYRING_CONTROL=/tmp/keyring-LMBe5X
OLDPWD=/usr/local/cuda
GTK_MODULES=canberra-gtk-module
USER=ely
LD_LIBRARY_PATH=/usr/local/cuda/lib64:
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd
=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=0
1;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.
tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz
=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.d
eb=01;31:*.rpm=01;31:*.jar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;3
1:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=0
1;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.t
iff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;
35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.ogm=01;35:*.mp4=01;35:*.m4
v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:
*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;
35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=
01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.
mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;3
6:*.wav=00;36:*.axa=00;36:*.oga=00;36:*.spx=00;36:*.xspf=00;36:
SSH_AUTH_SOCK=/tmp/keyring-LMBe5X/ssh
SESSION_MANAGER=local/zaffpants:@/tmp/.ICE-unix/3470,unix/zaffpants:/tmp/.ICE-unix/3470
USERNAME=ely
DEFAULTS_PATH=/usr/share/gconf/gnome-classic.default.path
XDG_CONFIG_DIRS=/etc/xdg/xdg-gnome-classic:/etc/xdg
PATH=/home/ely/.local/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games
DESKTOP_SESSION=gnome-classic
PWD=/home/ely/NVIDIA_GPU_Computing_SDK
GDM_KEYBOARD_LAYOUT=us
LANG=en_US.UTF-8
GNOME_KEYRING_PID=3451
GDM_LANG=en_US.utf8
MANDATORY_PATH=/usr/share/gconf/gnome-classic.mandatory.path
UBUNTU_MENUPROXY=libappmenu.so
GDMSESSION=gnome-classic
SPEECHD_PORT=7560
SHLVL=1
HOME=/home/ely
LANGUAGE=en_US:en
GNOME_DESKTOP_SESSION_ID=this-is-deprecated
PYTHONPATH=/usr/local/lib/python2.6/site-packages/mahotas-0.6.5-py2.7.egg-info:/usr/local/lib/python2.6/site-packages/mahotas:/usr/local/lib/python2.6/site-packages/cv.so:/usr/local/lib/python2.6/site-packages:/home/ely/OpenCV-2.2.0/lib:
LOGNAME=ely
XDG_DATA_DIRS=/usr/share/gnome-classic:/usr/share/gnome:/usr/local/share/:/usr/share/
DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-9jZzll5a0L,guid=1cf06b7b9f1f949f1eefd08c0000031c
LESSOPEN=| /usr/bin/lesspipe %s
WINDOWPATH=8
DISPLAY=:0.0
LESSCLOSE=/usr/bin/lesspipe %s %s
XAUTHORITY=/var/run/gdm/auth-for-ely-v4HfMJ/database
COLORTERM=gnome-terminal
_=/usr/bin/env

Original comment by spear...@gmail.com on 24 Aug 2011 at 3:42

GoogleCodeExporter commented 8 years ago
I apologize, but I am a bit of a novice with nvcc. When you say to add -v 
-keep, do you mean that I should edit some of the makefiles included with the 
Nvidia SDK? What's happening here is that I install Cuda and the SDK correctly, 
then try to make the examples in the SDK/C subdirectory, and that make step 
produces the errors above.

This error only happens with Cuda 4.0, but with 4.0 it happens every time, and 
I have seen it on my home Ubuntu 11 system as well as on this work system. When 
I go through the steps to install Cuda 3.2 or 3.1, the error does not occur. 
However, my lab mate who sits next to me also has an Ubuntu 11.04 system with 
the exact same software that I have, and he had no trouble installing Cuda 4.0. 
That makes me inclined to believe it's not a Cuda error but rather some sort of 
flags problem on my end; the trouble is that I can't find any information about 
how to resolve it.

Let me know exactly which commands I should change to pass the -v -keep options 
to NVCC, and I'll paste the output.

Original comment by spear...@gmail.com on 24 Aug 2011 at 3:47

GoogleCodeExporter commented 8 years ago
Try removing the variable CPLUS_INCLUDE_PATH from your environment. This may be 
why nvcc is mistaking the contents of /usr/local/cuda/include for system files 
(GCC treats directories listed in CPLUS_INCLUDE_PATH as system include 
directories, much as if they were passed with -isystem). A quick check would be 
to do

$ export CPLUS_INCLUDE_PATH=

and then try the build as you normally would.

Original comment by jaredhoberock on 24 Aug 2011 at 6:01

GoogleCodeExporter commented 8 years ago
This worked, and the make process now runs just fine for all of the included 
demos in the SDK. I'm not sure why CPLUS_INCLUDE_PATH is set to include 
/usr/local/cuda/include, or why nvcc doesn't like this, but at least this 
offers a fix. Thank you very much.

Original comment by spear...@gmail.com on 24 Aug 2011 at 6:40