Closed ChrisRackauckas closed 1 year ago
Merging #204 (572d694) into master (02c95fe) will decrease coverage by 0.32%. The diff coverage is 0.00%.
@@ Coverage Diff @@
## master #204 +/- ##
==========================================
- Coverage 73.46% 73.14% -0.32%
==========================================
Files 20 20
Lines 686 689 +3
==========================================
Hits 504 504
- Misses 182 185 +3
Impacted Files | Coverage Δ |
---|---|
src/ComponentArrays.jl | 88.88% <ø> (ø) |
src/compat/chainrulescore.jl | 64.28% <0.00%> (-17.54%) :arrow_down: |
Functors is an extremely lightweight dependency (it literally just depends on LinearAlgebra).
> I would guess you could turn it into a componentarray and then apply `backing`?
That was my first attempt. But `Tangent` is not an `AbstractArray`, so CA wouldn't flatten it.
> Functors is an extremely lightweight dependency (it literally just depends on LinearAlgebra).
That should be fine, then.
This should be good to go now. Fixes the last bug in https://github.com/SciML/SciMLSensitivity.jl/pull/818
@avik-pal this needs `fmap` and thus a Functors.jl dependency. Could that be avoided somehow, or is it necessary? I would guess you could turn it into a componentarray and then apply `backing`?
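For anyone landing here later, the issue discussed above can be sketched roughly as follows. This is a hypothetical illustration, not the PR's actual code: it assumes `ChainRulesCore.Tangent`, `ChainRulesCore.backing`, and the `ComponentArray` constructor from a `NamedTuple`, and only shows the single-level case (nested `Tangent`s are why `fmap` comes up).

```julia
using ChainRulesCore, ComponentArrays

# A Tangent is a structural gradient, not an AbstractArray, so
# ComponentArray(t) cannot flatten it directly.
t = Tangent{Any}(; a = 1.0, b = [2.0, 3.0])

# backing exposes the underlying NamedTuple, which ComponentArray
# does know how to flatten into a labeled flat vector.
nt = ChainRulesCore.backing(t)   # (a = 1.0, b = [2.0, 3.0])
ca = ComponentArray(nt)          # flat storage with component axes
```

For nested structural tangents, a single `backing` call is not enough, which is where a recursive walk (e.g. via Functors.jl's `fmap`) would come in.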