SciML / DifferenceEquations.jl

Solving difference equations with DifferenceEquations.jl and the SciML ecosystem.

Benchmark likelihood with a noise matrix vs. one doing a transformation of vector to vector in turing #28

Closed: jlperla closed this issue 2 years ago

jlperla commented 2 years ago

The current likelihood within Turing does

ϵ_draw ~ MvNormal(m.n_ϵ * T, 1.0)
ϵ = map(i -> ϵ_draw[((i-1)*m.n_ϵ+1):(i*m.n_ϵ)], 1:T)

and then passes the ϵ vector of vectors to the solve. If we have a custom adjoint, we can instead reshape ϵ_draw into a matrix and avoid that transformation within the likelihood calculation itself. It is hard to know whether the backprop through the vector-of-vectors ϵ is slow without benchmarking it.
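Something like the following is what I have in mind for the matrix version (just a sketch; m.n_ϵ and T are as in the model above, and a solve method that accepts the noise as a matrix is assumed):

ϵ_draw ~ MvNormal(m.n_ϵ * T, 1.0)   # same draw as before
ϵ_mat = reshape(ϵ_draw, m.n_ϵ, T)   # column t holds the n_ϵ shocks for period t
# ... then pass ϵ_mat to solve instead of the vector of vectors

Since reshape is column-major, column t of ϵ_mat contains exactly the elements ((t-1)*m.n_ϵ+1):(t*m.n_ϵ) that the map above pulls out, so the two parameterizations are equivalent.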

Regardless, inside of the solve with a custom adjoint the performance should be irrelevant, because we can use efficient views and mutating matrices. That wasn't necessarily true when Zygote needed to differentiate it directly.
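For the benchmarking itself, a rough sketch along these lines should be enough to see whether the vector-of-vectors slicing is the bottleneck under Zygote (the quadratic loss is just a stand-in for the real likelihood, and the sizes are made up):

using Zygote, BenchmarkTools
n_ϵ, T = 2, 200
ϵ_draw = randn(n_ϵ * T)
# slicing into a vector of vectors, as in the current Turing model
loss_vov(x) = sum(map(i -> sum(abs2, x[((i-1)*n_ϵ+1):(i*n_ϵ)]), 1:T))
# single reshape into an n_ϵ × T matrix
loss_mat(x) = sum(abs2, reshape(x, n_ϵ, T))
@btime Zygote.gradient(loss_vov, $ϵ_draw)
@btime Zygote.gradient(loss_mat, $ϵ_draw)

If the two gradients are comparable in time and allocations, the map-based transformation in the model is probably fine and the custom adjoint is the only place we need the matrix form.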