QB3 / sparse-ho

Fast hyperparameter settings for non-smooth estimators:
http://qb3.github.io/sparse-ho
BSD 3-Clause "New" or "Revised" License

Isolated optimizer #51

Closed QB3 closed 3 years ago

QB3 commented 3 years ago

The goal of this PR is to decouple the outer optimization process from the grad_search function. This paves the way for a naive gradient descent implementation for debugging, as well as for more refined optimizers such as Adam.

closes #50
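The decoupling described above can be sketched as follows. This is a minimal illustration, not sparse-ho's actual API: the class names, `step` signature, and the `compute_grad` callback are hypothetical, chosen only to show how `grad_search` can delegate the update rule to a pluggable optimizer object.

```python
class Optimizer:
    """Base class: subclasses implement one hyperparameter update step."""

    def step(self, log_alpha, grad):
        raise NotImplementedError


class GradientDescent(Optimizer):
    """Plain gradient descent, mainly useful for debugging."""

    def __init__(self, step_size=0.01):
        self.step_size = step_size

    def step(self, log_alpha, grad):
        return log_alpha - self.step_size * grad


class Adam(Optimizer):
    """Adam update on the (log-)hyperparameter, with bias correction."""

    def __init__(self, step_size=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
        self.step_size, self.beta1, self.beta2, self.eps = step_size, beta1, beta2, eps
        self.m = 0.0  # first-moment (mean) estimate
        self.v = 0.0  # second-moment (uncentered variance) estimate
        self.t = 0    # step counter for bias correction

    def step(self, log_alpha, grad):
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad ** 2
        m_hat = self.m / (1 - self.beta1 ** self.t)
        v_hat = self.v / (1 - self.beta2 ** self.t)
        return log_alpha - self.step_size * m_hat / (v_hat ** 0.5 + self.eps)


def grad_search(compute_grad, log_alpha0, optimizer, n_outer=10):
    """Outer loop: only queries the hypergradient; the optimizer owns the update."""
    log_alpha = log_alpha0
    for _ in range(n_outer):
        grad = compute_grad(log_alpha)
        log_alpha = optimizer.step(log_alpha, grad)
    return log_alpha
```

With this split, swapping the update rule is a one-line change at the call site, e.g. `grad_search(compute_grad, log_alpha0, Adam(step_size=0.05))`, and `grad_search` itself never needs to know which optimizer is in use.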

codecov-io commented 3 years ago

Codecov Report

Merging #51 (cdab28c) into master (138d935) will decrease coverage by 0.86%. The diff coverage is 47.57%.


@@            Coverage Diff             @@
##           master      #51      +/-   ##
==========================================
- Coverage   66.01%   65.15%   -0.86%     
==========================================
  Files          35       41       +6     
  Lines        2704     2801      +97     
  Branches      247      255       +8     
==========================================
+ Hits         1785     1825      +40     
- Misses        853      910      +57     
  Partials       66       66              
| Impacted Files | Coverage Δ | |
|---|---|---|
| sparse_ho/optimizers/base.py | 0.00% <0.00%> | (ø) |
| sparse_ho/optimizers/adam.py | 12.90% <12.90%> | (ø) |
| sparse_ho/optimizers/line_search_wolfe.py | 13.15% <13.15%> | (ø) |
| sparse_ho/optimizers/gradient_descent.py | 19.04% <19.04%> | (ø) |
| sparse_ho/optimizers/line_search.py | 68.49% <68.49%> | (ø) |
| sparse_ho/ho.py | 100.00% <100.00%> | (+50.00%) ⬆️ |
| sparse_ho/optimizers/\_\_init\_\_.py | 100.00% <100.00%> | (ø) |
| sparse_ho/tests/test_elastic.py | 98.95% <100.00%> | (+0.04%) ⬆️ |
| sparse_ho/tests/test_grad_search.py | 100.00% <100.00%> | (ø) |
| sparse_ho/tests/test_logreg.py | 98.42% <100.00%> | (+0.07%) ⬆️ |

... and 8 more
