LLNL / RAJA

RAJA Performance Portability Layer (C++)
BSD 3-Clause "New" or "Revised" License

Linking compatibility between different RAJA configs #1171

Open MrBurmark opened 2 years ago

MrBurmark commented 2 years ago

I just ran into an issue where I accidentally linked a library built against RAJA configured without CUDA into a code built against RAJA configured with CUDA. Surprisingly, this built and linked without complaint, then failed at runtime when the library tried to use device memory in a sequential loop. RAJA configured with CUDA already allows usage without a CUDA compiler. Should we find a way to allow only one RAJA configuration per build? Perhaps we could add a symbol that represents the RAJA configuration, so that accidentally building with two differently configured RAJA installs would fail at link time.

rhornung67 commented 2 years ago

I think that's a good idea since it would prevent user errors.