ghost closed this pull request 3 years ago.
Merging #47 (007f698) into master (c5e60bf) will decrease coverage by 1.57%. The diff coverage is 100.00%.
```diff
@@            Coverage Diff             @@
##           master      #47      +/-   ##
==========================================
- Coverage   93.03%   91.46%   -1.58%
==========================================
  Files          10       10
  Lines         589      586       -3
==========================================
- Hits          548      536      -12
- Misses         41       50       +9
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/alspgrad.jl | 85.14% <100.00%> (-5.31%) | :arrow_down: |
Continue to review full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update c5e60bf...007f698. Read the comment docs.
@andreasnoack, could you please review this PR?
@andreasnoack, could I ask what's blocking this PR from being merged? I think fixing #29 would make the CI genuinely useful.
Hi @tsano430. I'm not really qualified to review this and, unfortunately, I also have too little time for general reviews for the time being. This package hasn't been maintained for a while, so your contributions are appreciated. I've invited you to the repo so that you can more easily make progress here. It would be good to continue to let PRs stay open for a couple of days, but generally feel free to merge PRs that haven't received any suggestions for changes.
@andreasnoack, thank you for your polite response. I've accepted your invitation to join JuliaStats/NMF.jl. From now on, I will contribute to this package in keeping with your advice.
I fixed #29 by reimplementing `alspgrad.jl` as in the following original paper: Chih-Jen Lin, Projected Gradient Methods for Non-negative Matrix Factorization, Neural Computation, 19 (2007).

Here is the MATLAB code in the original paper. In addition, I fixed the parameters `maxiter` and `tolg` in `test/alspgrad.jl`, because they prevented `alspgrad.jl` from obtaining a high-precision solution.
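For context, the core idea of Lin's projected gradient method can be sketched roughly as below. This is a minimal illustration, not the code from this PR: the function name `projgrad_step` and the parameter defaults are made up for the example; it shows one projected-gradient step with backtracking line search on the subproblem min f(H) = 0.5‖X − WH‖² over H ≥ 0.

```julia
# Sketch (hypothetical, not the package's implementation) of one
# projected-gradient step from Lin (2007) for the NMF subproblem
#   minimize f(H) = 0.5 * ||X - W*H||^2   subject to H >= 0.
using LinearAlgebra

function projgrad_step(X, W, H; alpha=1.0, beta=0.1, sigma=0.01)
    WtW  = W' * W
    grad = WtW * H - W' * X              # gradient of f at H
    # Backtracking: shrink the step size until sufficient decrease holds.
    for _ in 1:20
        Hn = max.(H .- alpha .* grad, 0.0)   # gradient step + projection onto H >= 0
        d  = Hn - H
        # For quadratic f, f(Hn) - f(H) = <grad, d> + 0.5 <d, WtW d> exactly,
        # so this is Lin's sufficient-decrease condition:
        if (1 - sigma) * dot(grad, d) + 0.5 * dot(d, WtW * d) <= 0
            return Hn
        end
        alpha *= beta
    end
    return max.(H .- alpha .* grad, 0.0)     # fall back to the smallest trial step
end
```

The W subproblem is handled symmetrically (by transposing), and the alternating least-squares driver calls such a step repeatedly until a gradient-norm tolerance like `tolg` is met.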