MadNLP / DynamicNLPModels.jl

NLPModels for dynamic optimization
MIT License

Improved Efficiency of Sparse Model construction #32

Closed dlcole3 closed 2 years ago

dlcole3 commented 2 years ago

Sparse model construction was very slow because the for loops that built the sparse H and J matrices repeatedly assigned dense blocks into sparse arrays (e.g., H[..., ...] = Q). This PR instead builds the colptr, rowval, and nzval vectors directly from the corresponding data and then passes them to the SparseMatrixCSC constructor to create each sparse array in one step. The colptr, rowval, and nzval vectors for H and J are instantiated within the _build_sparse_lq_dynamic_model function, and the _set_sparse_H! and _set_sparse_J! functions are called to fill them. For large problems (N, ns, and nu ~100+), these changes yield a significant speedup.
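The idea can be illustrated with a small standalone sketch (this is not the package's actual helper; the function name and block-diagonal layout are hypothetical). Instead of writing dense blocks into a sparse matrix with indexed assignment, we fill colptr, rowval, and nzval in one pass and call the SparseMatrixCSC constructor once:

```julia
using SparseArrays

# Hypothetical illustration: build a block-diagonal H containing N copies
# of a dense ns-by-ns block Q by filling the CSC vectors directly,
# avoiding repeated sparse indexed assignment such as H[r, c] = Q.
function block_diag_csc(Q::Matrix{Float64}, N::Int)
    ns = size(Q, 1)
    n = N * ns                 # dimension of H
    nnz_total = N * ns * ns    # every block entry is stored
    colptr = Vector{Int}(undef, n + 1)
    rowval = Vector{Int}(undef, nnz_total)
    nzval  = Vector{Float64}(undef, nnz_total)
    colptr[1] = 1
    k = 0
    for b in 1:N               # block index
        off = (b - 1) * ns     # row/column offset of this block
        for j in 1:ns          # column within the block
            for i in 1:ns      # row within the block (rows stay sorted)
                k += 1
                rowval[k] = off + i
                nzval[k]  = Q[i, j]
            end
            colptr[off + j + 1] = k + 1  # column off+j ends at entry k
        end
    end
    return SparseMatrixCSC(n, n, colptr, rowval, nzval)
end
```

Filling the three vectors costs O(nnz) total, whereas each indexed assignment into an existing sparse matrix may shift the stored entries, which is what made the original loop-based construction slow.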

codecov-commenter commented 2 years ago

Codecov Report

Merging #32 (641c280) into main (fe9901e) will increase coverage by 0.18%. The diff coverage is 100.00%.

@@            Coverage Diff             @@
##             main      #32      +/-   ##
==========================================
+ Coverage   97.79%   97.97%   +0.18%     
==========================================
  Files           1        1              
  Lines         724      790      +66     
==========================================
+ Hits          708      774      +66     
  Misses         16       16              
Impacted Files            Coverage Δ
src/DynamicNLPModels.jl   97.97% <100.00%> (+0.18%) ↑
