dlcole3 closed this pull request 2 years ago
Note that the current formulation includes a `K` matrix in the sparse model. I now realize I don't need that, but I will fix it in a separate PR. I had already enabled the new functions with `K` when I learned I did not need to use `K` with the sparse form.
Merging #42 (ba38441) into main (0365640) will increase coverage by 0.51%. The diff coverage is 100.00%.
```
@@            Coverage Diff             @@
##             main      #42      +/-   ##
==========================================
+ Coverage   97.30%   97.81%   +0.51%
==========================================
  Files           4        4
  Lines        1188     1466     +278
==========================================
+ Hits         1156     1434     +278
  Misses         32       32
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| src/LinearQuadratic/sparse.jl | 100.00% <100.00%> (ø) | |
Added new
_build
and_set
functions in thesparse.jl
source code to support sparseA, B, Q, R, K,
andS
matrices. Currently, when the problem becomes very large, the sparse formulation is not very fast and it takes a lot of memory allocation. This is largely because the old code treated allA, B, Q, R, K,
andS
matrices as being dense. Now, the Hessian and Jacobian are set using therowval
,nzval
, andcolptr
attributes ofA, B, Q, R, K,
andS
. This significantly reduces the memory required to build theSparseLQDynamicModel
.For N = 50, nu = 10, ns = 2000, I got the following results using
@time
dense LQDynamicData: 18.737979 seconds (200.89 k allocations: 9.577 GiB, 9.12% gc time) sparse LQDynamicData: 0.671002 seconds (1.51 M allocations: 134.857 MiB, 2.67% gc time)Also added tests to ensure that the Hessian and Jacobian resulting from the new sparse functions are the same as before.