Closed: oschulz closed this issue 2 days ago
I'm using a `.julia/environments/v1.10/LocalPreferences.toml` with `[CUDA_Runtime_jll] local = "true" version = "local"`
That only works if `CUDA_Runtime_jll` is part of your environment, which is what the `set_runtime_version!` call does (adding it to the `[extras]` section).
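For reference, a working local-toolkit setup needs matching entries in both files of the environment. A sketch, assuming the preference keys quoted above (the `[extras]` UUID is a placeholder here; `set_runtime_version!` fills in the real one for you):

```toml
# ~/.julia/environments/v1.10/LocalPreferences.toml
[CUDA_Runtime_jll]
local = "true"
version = "local"
```

```toml
# ~/.julia/environments/v1.10/Project.toml (excerpt)
# Without this [extras] entry, the preference above is silently ignored.
[extras]
CUDA_Runtime_jll = "<UUID of CUDA_Runtime_jll from the registry>"
```

In practice it is easiest to let CUDA.jl write both files by calling `CUDA.set_runtime_version!(local_toolkit=true)` (the CUDA.jl v5 keyword API) and restarting Julia.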
You didn't mention which version of CUDA.jl you're using. The latest version should support Tegra devices out of the box, i.e., without the need for a local toolkit.
Installing CUDA.jl results in lots of errors like `NvRmMemInit failed` error type: 196626
Errors like this indicate that your user doesn't have sufficient permissions. At the very least, you need to be in the `video` group.
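A quick way to verify this is a shell check like the following (a sketch; the exact device nodes a Jetson exposes vary by board, but membership in the `video` group is the common requirement):

```shell
# Check whether the current user is in the "video" group, which Jetson
# boards require for access to the GPU device nodes.
if id -nG | grep -qw video; then
    echo "OK: user is in the video group"
else
    echo "Missing: run 'sudo usermod -aG video \$USER', then log out and back in"
fi
```

Note that after `usermod -aG`, the new group only takes effect in a fresh login session.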
That only works if CUDA_Runtime_jll is part of your environment,
Oops, my bad!
Errors like this indicate that your user doesn't have sufficient permissions. At the very least, you need to be in the video group.
Thanks for the hint, that was it.
In general, would you recommend using the preinstalled CUDA toolkit, or the latest Julia-installed one, on such systems?
would you recommend using the preinstalled CUDA toolkit, or the latest Julia-installed one, on such systems
I would always recommend using the Julia-installed one: it ensures both driver <-> toolkit and toolkit <-> CUDA.jl compatibility. Here, for example, we should be able to use CUDA toolkit 11.8 instead of the 11.4 you provided.
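Switching back to the Julia-managed toolkit can be sketched as follows (assuming a recent CUDA.jl, where `reset_runtime_version!` and `versioninfo` are the documented entry points; exact output varies by version):

```julia
using CUDA

# Drop the "local" runtime preference written earlier, reverting to the
# artifact-provided toolkit chosen to match the installed driver.
CUDA.reset_runtime_version!()

# After restarting Julia, this reports the driver and runtime versions
# and whether the runtime comes from an artifact or the local system.
CUDA.versioninfo()
```

Restarting Julia between the two calls is required, since runtime preferences are only read at package load time.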
Thanks again Tim!
I just tried CUDA.jl on an NVIDIA Jetson Orin NX, but installing CUDA.jl results in errors and CUDA.jl doesn't work. This is with a fresh, "empty" Julia v1.10.4 install.

I'm using a `.julia/environments/v1.10/LocalPreferences.toml` with `[CUDA_Runtime_jll] local = "true" version = "local"`, but `add CUDA` still results in a download of the 1.6 GB CUDA runtime artifact. So while the Jetson system comes with `/usr/local/cuda-11.4/` preinstalled, I don't think CUDA.jl tries to use it, despite the `LocalPreferences.toml`.

Installing CUDA.jl results in lots of errors like `NvRmMemInit failed` error type: 196626, and `using CUDA` afterwards results in the same kind of errors. I've used Julia with CUDA successfully on a Jetson TX2, a Jetson Nano, and a Jetson Xavier NX in the past; it always worked out of the box.