ChrisRackauckas closed this pull request 2 years ago.
Merging #168 (8691112) into master (f10cda4) will increase coverage by 0.02%. The diff coverage is 100.00%.
@@ Coverage Diff @@
## master #168 +/- ##
==========================================
+ Coverage 78.86% 78.89% +0.02%
==========================================
Files 14 14
Lines 743 744 +1
==========================================
+ Hits 586 587 +1
Misses 157 157
Impacted Files | Coverage Δ |
---|---|
src/differentiation/compute_jacobian_ad.jl | 94.17% <100.00%> (+0.02%) :arrow_up: |
Continue to review the full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f10cda4...8691112.
Nope, @YingboMa that also causes:
julia> itrigs = inference_triggers(tinf)
6-element Vector{InferenceTrigger}:
Inference triggered to call hcat(::BitMatrix, ::BitMatrix) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:67) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
Inference triggered to call Vector{Tuple{Float64, Float64, Float64}}(::UndefInitializer, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:73) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
Inference triggered to call ntuple(::SparseDiffTools.var"#15#16"{3, Int64, Int64}, ::Val{3}) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
Inference triggered to call setindex!(::Vector{Tuple{Float64, Float64, Float64}}, ::Tuple{Bool, Bool, Bool}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
Inference triggered to call setindex!(::Vector{Vector{Tuple{Float64, Float64, Float64}}}, ::Vector{Tuple{Float64, Float64, Float64}}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:77) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
Inference triggered to call OrdinaryDiffEq.jacobian2W!(::Matrix{Float64}, ::UniformScaling{Bool}, ::Float64, ::Matrix{Float64}, ::Bool) called from toplevel
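For context, this is the standard SnoopCompile workflow that produces output like the above; a minimal sketch follows, where the workload inside the `@snoopi_deep` block is only an assumed stand-in for whatever was actually profiled in this thread:

```julia
using SnoopCompileCore
tinf = @snoopi_deep begin
    # run the workload to profile here, e.g. a first call into
    # SparseDiffTools in a fresh Julia session (assumed workload)
end
using SnoopCompile                   # analysis tools, loaded after collection
itrigs = inference_triggers(tinf)    # calls that forced fresh inference at runtime
```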
I'll try to play with this locally and submit a PR.
Actually, this seems better: https://github.com/JuliaDiff/SparseDiffTools.jl/pull/169/files
On this example:
we saw just two inference triggers:
The second one is clear: okay, just use ntuple on line 75, right? So I did that, and it made the function more type-unstable. Look at the itrigs after that:
@chriselrod or @YingboMa, could I get some Cthulhu magic to look at what's going on there?
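For reference, here is a hedged sketch of the kind of `ntuple`-over-`Val` rewrite being discussed, keeping the tuple length in the type domain so the result type is inferrable; the function name and arguments below are illustrative stand-ins, not the actual SparseDiffTools code:

```julia
# Hypothetical one-hot seed tuple of statically known length N and element type T,
# built with ntuple(f, Val(N)) so inference sees a concrete NTuple{N,T}.
seed_partials(x::AbstractVector{T}, i::Integer, ::Val{N}) where {T,N} =
    ntuple(j -> convert(T, j == i), Val(N))

# One way to drill into a specific trigger with Cthulhu (assuming Cthulhu is
# installed and `itrigs` comes from inference_triggers above):
# using Cthulhu
# ascend(itrigs[2])
```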