Closed · ldeso closed this 1 year ago
Merging #166 (61801da) into master (a7d6f7d) will decrease coverage by 0.62%. The diff coverage is 0.00%.
```
@@            Coverage Diff             @@
##           master     #166      +/-   ##
==========================================
- Coverage   77.13%   76.50%   -0.63%
==========================================
  Files          20       20
  Lines         608      613       +5
==========================================
  Hits          469      469
- Misses        139      144       +5
==========================================
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/compat/gpuarrays.jl | 65.78% <0.00%> (-9.97%) | :arrow_down: |
Cool, thanks!
Currently, constructing a ComponentArray on the GPU, for example with `ComponentArray(a=CUDA.ones(2,3), b=CUDA.ones(4,5))`, fails with an error. This happens because constructing the ComponentArray creates an array of arrays, which is not possible on the GPU. The failure occurs in the `pushcat!` call at line 179 of componentarray.jl: https://github.com/jonniedie/ComponentArrays.jl/blob/a7d6f7d126c5aeb711b65cf74eff8f8cbbeca5d0/src/componentarray.jl#L178-L182
This pull request fixes the issue by constructing the ComponentArray on the CPU and moving it back to the GPU afterwards.
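The CPU-then-GPU strategy can be sketched manually as follows. This is a rough illustration of the idea, not the code added by the PR; it assumes `getdata` and `getaxes` (ComponentArrays' accessors for the flat backing data and the axis metadata) and requires a CUDA-capable GPU to run.

```julia
using ComponentArrays, CUDA

# Hypothetical manual workaround mirroring the fix's approach.
a_gpu = CUDA.ones(2, 3)
b_gpu = CUDA.ones(4, 5)

# Build the ComponentArray on the CPU, where the intermediate
# array-of-arrays created during construction is allowed.
ca_cpu = ComponentArray(a = Array(a_gpu), b = Array(b_gpu))

# Move the flat backing data back to the device, reusing the axis
# metadata so component access (ca_gpu.a, ca_gpu.b) still works.
ca_gpu = ComponentArray(cu(getdata(ca_cpu)), getaxes(ca_cpu)...)
```

With the PR applied, the direct `ComponentArray(a=CUDA.ones(2,3), b=CUDA.ones(4,5))` call should perform this detour internally.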
Fixes #158. Needed for SciML/NeuralPDE.jl#594.