Open vlc1 opened 1 year ago
| Totals | |
|---|---|
| Change from base Build 2991758705: | -0.0% |
| Covered Lines: | 1933 |
| Relevant Lines: | 1986 |
I believe we've gone back and forth on this. Long ago there was overhead to broadcasting, so the BLAS routines were called. Then Julia improved and folks wanted broadcasting because it's more generic. I think I'm happy with the current broadcasting stuff, also because, for example, CG would need an `axpby` somewhere, which I think is not standard BLAS.
These functions are now owned by LinearAlgebra (and not `LinearAlgebra.BLAS`) and admit generic fallbacks, likely using broadcasting but I don't remember. Another option could be to introduce `_axp[b]y!`, that use broadcasting generically but may be overloaded with a custom `axpy!`-like implementation.
> These functions are now owned by LinearAlgebra (and not `LinearAlgebra.BLAS`) and admit generic fallbacks, likely using broadcasting but I don't remember. Another option could be to introduce `_axp[b]y!`, that use broadcasting generically but may be overloaded with a custom `axpy!`-like implementation.
Fair enough, I might have to dig into how to customize broadcasting for my own datatypes...
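For reference, opting a custom type into broadcasting takes only a couple of method definitions. This is a minimal sketch, with `MyVec` as a hypothetical wrapper type:

```julia
# Hypothetical wrapper type standing in for a custom vector.
struct MyVec{T} <: AbstractVector{T}
    data::Vector{T}
end
Base.size(v::MyVec) = size(v.data)
Base.getindex(v::MyVec, i::Int) = v.data[i]
Base.setindex!(v::MyVec, x, i::Int) = (v.data[i] = x)

# 1. Give MyVec its own broadcast style.
Base.BroadcastStyle(::Type{<:MyVec}) = Broadcast.ArrayStyle{MyVec}()

# 2. Tell broadcast how to allocate a destination of that style.
function Base.similar(bc::Broadcast.Broadcasted{Broadcast.ArrayStyle{MyVec}},
                      ::Type{T}) where T
    MyVec(Vector{T}(undef, length(axes(bc, 1))))
end

x = MyVec([1.0, 2.0]); y = MyVec([3.0, 4.0])
z = x .+ 2 .* y    # allocating broadcast returns a MyVec
y .= x .+ y        # in-place broadcast goes through setindex!
```

In-place broadcasts like `y .= x .+ y` already work through the `AbstractVector` interface; the style and `similar` methods are what make allocating broadcasts return the custom type.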
> These functions are now owned by LinearAlgebra (and not `LinearAlgebra.BLAS`) and admit generic fallbacks, likely using broadcasting but I don't remember. Another option could be to introduce `_axp[b]y!`, that use broadcasting generically but may be overloaded with a custom `axpy!`-like implementation.
That's pretty much what Krylov.jl does with some macros (`@kaxpy`...).
When using IterativeSolvers.jl on custom vectors and matrices, I find it easier to add methods to generic BLAS routines (`scal!`, `mul!`, `axpy!`...) than to customize the broadcasting machinery on said types. This is a very small modification that simply requires changing a few lines, as illustrated in the cg.jl file in this PR.
If this modification is acceptable, I'll work my way through the other solvers (bicgstabl...).
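The kind of change being proposed can be sketched as follows (illustrative only; the exact lines of cg.jl are in the PR diff, not reproduced here):

```julia
using LinearAlgebra

x = zeros(3); u = [1.0, 2.0, 3.0]; α = 0.5

# Broadcast form, the current style in the solvers:
x .+= α .* u

# BLAS-style form: dispatches through LinearAlgebra.axpy!, which a
# custom vector type can overload without touching broadcasting.
axpy!(α, u, x)
```

Both lines compute the same update; the difference is which mechanism (broadcast fusion vs. method dispatch on `axpy!`) a custom type has to hook into.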