tmigot opened 1 year ago
Patch coverage has no change and project coverage change: +3.29% :tada:

Comparison is base (65cd309) 81.46% compared to head (ec92e57) 84.76%.
I think, even though this PR might fix the issue (it seems untested), it's not the fix we want: the package already defines (too) many methods (see e.g. https://github.com/JuliaDiff/ReverseDiff.jl/issues/226), and adding `Diagonal` to the array type union would increase this number further by adding many more definitions, even though the issue was only about a specific method for `*`. It also seems a bit surprising to treat `Diagonal` in such a special way but not e.g. `Adjoint`, `Symmetric`, and other matrix types in `LinearAlgebra`.
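To make the concern concrete, here is a rough, hypothetical illustration of the combinatorics (the operation list and the tracked/plain pattern count are assumptions for illustration, not ReverseDiff's actual code generation):

```julia
# Hypothetical count, not ReverseDiff's actual internals: assume every binary
# operation gets one definition per ordered pair of array types in the union,
# times 3 tracked-ness patterns (tracked/plain, plain/tracked, tracked/tracked).
n_definitions(n_types, n_ops) = n_ops * 3 * n_types^2

n_ops = 4                 # e.g. *, +, -, mul! (illustrative choice)
n_definitions(3, n_ops)   # 108 definitions with 3 array types in the union
n_definitions(4, n_ops)   # 192 after adding one more type such as Diagonal
```

Under this assumption each additional wrapper type grows the method count quadratically, which is the concern about adding `Diagonal` to the union rather than fixing the single missing `*` method.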
Has there been any progress on this?
Multiplying a `Cholesky` matrix with a `Diagonal` does not work with ReverseDiff, e.g. in the example shared here: https://github.com/TuringLang/Turing.jl/issues/1870#issuecomment-1706595909
There's no multiplication of a `Diagonal` with a `Cholesky` matrix in the linked example? The model only multiplies a `Diagonal` with a `LowerTriangular` matrix. In any case, you could rewrite the multiplication with a `Diagonal` using broadcasting.
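For illustration, a minimal sketch of the broadcasting rewrite with plain `LinearAlgebra` values (the matrices here are made up; inside a model the same rewrite would be applied to the tracked arrays):

```julia
using LinearAlgebra

d = [1.0, 2.0, 3.0]
L = LowerTriangular([1.0 0.0 0.0; 4.0 5.0 0.0; 6.0 7.0 8.0])

# Diagonal(d) * L scales the rows of L; the broadcast d .* L computes the same
# product without dispatching to a Diagonal * LowerTriangular method.
Diagonal(d) * L ≈ d .* L    # true

# Multiplying by the Diagonal on the right scales the columns instead.
L * Diagonal(d) ≈ L .* d'   # true
```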
Generally, I still have the same feeling as in https://github.com/JuliaDiff/ReverseDiff.jl/pull/231#issuecomment-1550198020.
Sorry for being imprecise. The `LowerTriangular` comes from a `Cholesky`, though. It was recently implemented (which is great!) and I suspect I won't be the last one to try a similar operation.
The point you raised in https://github.com/JuliaDiff/ReverseDiff.jl/pull/231#issuecomment-1550198020 seems reasonable. I am pretty new to this ecosystem and have no strong feelings about this whatsoever. But I think it would be nice if models (like the one I linked) worked as consistently as possible across AD backends. Or is this just not possible?
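To connect the two comments above, a small sketch (plain `LinearAlgebra`, no AD involved; the matrix is made up) showing that the factor extracted from a `Cholesky` is a `LowerTriangular`, so the broadcasting rewrite applies to it as well:

```julia
using LinearAlgebra

A = [4.0 2.0; 2.0 3.0]           # a small positive-definite matrix
C = cholesky(A)
C.L isa LowerTriangular          # true: the lower factor is wrapped as LowerTriangular

d = [1.0, 2.0]
Diagonal(d) * C.L ≈ d .* C.L     # true: the broadcast form of the same product
```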
Close #223