JuliaGPU / Adapt.jl


Revert "Implement missing LinearAlgebra wrappers and add support for uplo parameter" #70

Closed: maleadt closed this 8 months ago

maleadt commented 8 months ago

Reverts JuliaGPU/Adapt.jl#51, which breaks CUDA.jl:

```julia
julia> using CUDA
Precompiling CUDA
  6 dependencies successfully precompiled in 20 seconds. 60 already precompiled.

julia> A = CUDA.rand(2,2)
2×2 CuArray{Float32, 2, CUDA.Mem.DeviceBuffer}:
 0.523348  0.920557
 0.283318  0.778918

julia> using LinearAlgebra

julia> cholesky(Hermitian(A*A'+I, :L), NoPivot())
ERROR: a exception was thrown during kernel execution.
Stacktrace:
 [1] setindex! at /cache/build/default-amdci5-7/julialang/julia-release-1-dot-10/usr/share/julia/stdlib/v1.10/LinearAlgebra/src/symmetric.jl:264
 [2] _setindex! at ./abstractarray.jl:1424
 [3] setindex! at ./abstractarray.jl:1389
 [4] linear_copy_kernel! at /home/tim/.julia/packages/GPUArrays/EZkix/src/host/abstractarray.jl:180
```
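For context, frame [1] in that trace is LinearAlgebra's `setindex!` for `Hermitian`, which only accepts real-valued writes on the diagonal; every other index throws, even on the stored triangle. The restriction reproduces on the CPU:

```julia
julia> using LinearAlgebra

julia> H = Hermitian(zeros(2, 2), :L);

julia> H[1, 1] = 1.0   # real-valued diagonal writes are allowed
1.0

julia> H[2, 1] = 1.0   # any off-diagonal write throws, even on the stored triangle
ERROR: ArgumentError: Cannot set a non-diagonal index in a Hermitian matrix
```

GPUArrays' `linear_copy_kernel!` writes every linear index of its destination, so a destination that is still wrapped in `Hermitian` after adaptation hits this error on the first off-diagonal write. As a rough illustration only (this is not the actual code from #51, whose details differ), a structure-preserving wrapper rule with `uplo` support would look something like:

```julia
using Adapt, LinearAlgebra

# Hypothetical sketch: adapt the parent array (e.g. to a CuArray) and
# re-wrap it in Hermitian, carrying the uplo flag along.
Adapt.adapt_structure(to, A::Hermitian) =
    Hermitian(adapt(to, parent(A)), Symbol(A.uplo))
```

A rule along these lines keeps the `Hermitian` wrapper around the device array, which is exactly the situation the failing copy kernel runs into.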
codecov[bot] commented 8 months ago

Codecov Report

Attention: 7 lines in your changes are missing coverage. Please review.

Comparison is base (4f8952b) 77.77% compared to head (6cd90b0) 83.33%.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           master      #70      +/-   ##
==========================================
+ Coverage   77.77%   83.33%   +5.55%
==========================================
  Files           6        6
  Lines          81       72       -9
==========================================
- Hits           63       60       -3
+ Misses         18       12       -6
```

| [Files](https://app.codecov.io/gh/JuliaGPU/Adapt.jl/pull/70?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=JuliaGPU) | Coverage Δ | |
|---|---|---|
| [src/wrappers.jl](https://app.codecov.io/gh/JuliaGPU/Adapt.jl/pull/70?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=JuliaGPU#diff-c3JjL3dyYXBwZXJzLmps) | `72.22% <58.82%> (+7.77%)` | :arrow_up: |
