AnchorBlues closed this pull request 6 years ago.
Merging #249 into master will decrease coverage by 0.04%. The diff coverage is 0%.

```diff
@@            Coverage Diff             @@
##           master     #249      +/-   ##
==========================================
- Coverage    56.8%   56.75%    -0.05%
==========================================
  Files           7        7
  Lines        1301     1302        +1
  Branches      261      261
==========================================
  Hits          739      739
- Misses        492      495        +3
+ Partials       70       68        -2
```

Impacted Files | Coverage Δ | |
---|---|---|
pyglmnet/pyglmnet.py | 73.33% <0%> (-0.17%) | :arrow_down: |

Continue to review the full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 84869a0...bfdb111. Read the comment docs.
Merging #249 into master will increase coverage by 1.64%. The diff coverage is 100%.

```diff
@@            Coverage Diff             @@
##           master     #249      +/-   ##
==========================================
+ Coverage    56.8%   58.44%    +1.64%
==========================================
  Files           7        7
  Lines        1301     1302        +1
  Branches      261      261
==========================================
+ Hits          739      761       +22
+ Misses        492      472       -20
+ Partials       70       69        -1
```

Impacted Files | Coverage Δ | |
---|---|---|
pyglmnet/pyglmnet.py | 78.22% <100%> (+4.72%) | :arrow_up: |

Continue to review the full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 84869a0...5b9668e. Read the comment docs.
Thanks. Could you add a tiny test?
@jasmainak Sorry, I don't know what kind of test code I should write. I would be glad if you show me some example test code.
Think about how you discovered the bug. The test should basically reproduce that behavior in some sense. It should fail on master but pass on this branch.
You can add the test in this function. One easy check could be whether the estimated betas have the same non-zero indices as the simulated ones.
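That kind of support-recovery check could look something like the sketch below (illustrative only; `same_support` and the toy coefficient vectors are made up for this example, not the test that was actually added to pyglmnet):

```python
import numpy as np

def same_support(beta_est, beta_true, tol=1e-6):
    """Check that the estimated coefficients are nonzero exactly where
    the simulated (ground-truth) coefficients are nonzero."""
    return np.array_equal(np.abs(beta_est) > tol, np.abs(beta_true) != 0)

# Toy illustration with made-up coefficient vectors:
beta_true = np.array([0.0, 1.5, 0.0, -2.0])
beta_est = np.array([1e-9, 1.2, 0.0, -1.7])  # tiny numerical noise on index 0
assert same_support(beta_est, beta_true)
```

Such a test fails on master (where the group lasso bug zeroes or keeps the wrong coefficients) but passes on this branch.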
@jasmainak Thank you for your advice. I added test code to the test_group_lasso function. Please confirm.
@jasmainak I found that the last assertion of the `test_group_lasso` function,

assert (beta[group_norms <= thresh] == 0.0).all()

is not appropriate: `beta[group_norms <= thresh]` can be nonzero if the values of `result[idx_to_update]` drop below `thresh` at the final iteration of the optimization. So I removed the assertion from the test code.
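A toy numpy illustration of this failure mode (the `prox_group` operator and the trailing gradient step here are assumptions made for illustration, not pyglmnet's actual update order): the prox zeroes a group only at the moment it is applied, so one more gradient step can leave a group with a small but nonzero norm below the threshold.

```python
import numpy as np

def prox_group(beta, thresh, groups):
    """Group soft-thresholding: shrink each group's L2 norm by `thresh`,
    zeroing any group whose norm is <= thresh."""
    out = beta.copy()
    for g in np.unique(groups):
        idx = groups == g
        norm = np.linalg.norm(beta[idx])
        out[idx] = 0.0 if norm <= thresh else beta[idx] * (1.0 - thresh / norm)
    return out

groups = np.array([0, 0, 1, 1])
thresh = 0.1
beta = prox_group(np.array([0.5, 0.5, 0.05, 0.05]), thresh, groups)
# Group 1 (norm ~0.07 <= thresh) is exactly zero here; now take one more
# (hypothetical) gradient step before the optimizer stops:
beta = beta - 0.01 * np.array([0.0, 0.0, 1.0, 1.0])
group_norms = np.array([np.linalg.norm(beta[groups == g]) for g in groups])
# Group 1 now has a small nonzero norm <= thresh, so the removed assertion fails:
assert not (beta[group_norms <= thresh] == 0.0).all()
```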
@pavanramkumar please merge if you are happy. LGTM
Thanks @AnchorBlues and @jasmainak — great contribution!
@AnchorBlues looking forward to more contributions and bugfixes. Thanks!
@jasmainak @pavanramkumar Thank you for reviewing my fix and merging it. I am glad to contribute to your package. I'll continue to use this package for data science. Thanks!
In the proximal gradient method, weights whose absolute values are smaller than the threshold must be exactly 0.
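For the plain lasso penalty, this rule is the elementwise soft-thresholding operator. A minimal numpy sketch (illustrative only, not pyglmnet's internal implementation):

```python
import numpy as np

def soft_threshold(w, thresh):
    """Proximal operator of the L1 penalty: shrink each weight toward
    zero by `thresh`, zeroing any weight with |w| <= thresh."""
    return np.sign(w) * np.maximum(np.abs(w) - thresh, 0.0)

w = np.array([0.3, -0.05, 1.2, 0.08])
print(soft_threshold(w, 0.1))  # entries with |w| <= 0.1 become exactly 0
```

The group lasso variant applies the same shrinkage to each group's L2 norm instead of to individual weights.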