Closed: tiemvanderdeure closed this pull request 1 month ago
I think the test failures are unrelated to this PR?
@tiemvanderdeure Thanks for this PR, which looks like a valuable contribution. I'll try to investigate the failure soon and get back to you.
@tiemvanderdeure The source of your test failure is sorted. Do you mind rebasing onto dev?
Thanks for reviewing. I'll add some tests and documentation later this week!
@tiemvanderdeure I'm tweaking the tests a bit at #251. I suggest you wait till I finish that before writing your tests. Sorry for any inconvenience.
Regarding tests: I don't think it is necessary to run the optimiser test or to compare predictions between CPU and GPU, as we do for the other classifier. The basic test should suffice.
Re documentation: in view of #252, just merged, you only need to update the table at https://github.com/FluxML/MLJFlux.jl/blob/dev/docs/src/interface/Summary.md and write a model doc-string, modelled on the existing classifier.
I just added some quick docs; most of it is copied over from the docs for NeuralNetworkClassifier.
Should I still wait for https://github.com/FluxML/MLJFlux.jl/pull/251 before I can update the tests?
Attention: Patch coverage is 17.64706%, with 14 lines in your changes missing coverage. Please review.
Project coverage is 88.18%. Comparing base (ec6004f) to head (5842ed9). Report is 1 commit behind head on dev.
| Files | Patch % | Lines |
|---|---|---|
| src/classifier.jl | 18.18% | 9 missing :warning: |
| src/core.jl | 0.00% | 5 missing :warning: |
> Should I still wait for https://github.com/FluxML/MLJFlux.jl/pull/251 before I can update the tests?
Yes, I think that's close to finished. Just waiting on the follow-up review.
@tiemvanderdeure FYI #251 is now merged. Let me know if you need guidance on merge conflicts.
Okay, @tiemvanderdeure, I'm keen to move this along, now that a bunch of other PRs are also coming in. I'll resolve the conflicts myself, add tests, and merge. Will let you know if I encounter problems.
Thanks indeed for this valuable contribution 🙏🏾
Closed in favour of #257.
All right, thanks for finishing this @ablaom! Be sure to let me know if you need me to review or anything.
Great. Definitely add you to my list :heart:
This PR adds a separate constructor, `NeuralNetworkBinaryClassifier`, for binary classification. Unlike with the existing `NeuralNetworkClassifier` constructor, the final layer always has size 1, and by default a sigmoid function is applied to its output. This is a pretty common use of neural networks, and as far as I can tell it's not possible with the existing constructors. I'll add tests and documentation if this PR gets some support.
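To illustrate the intended workflow, here is a hedged usage sketch. The keyword arguments and the `make_moons` toy dataset are assumptions based on how the existing `NeuralNetworkClassifier` and MLJ are typically used, not confirmed API of this PR:

```julia
# Hypothetical usage sketch of the proposed NeuralNetworkBinaryClassifier.
# Keyword names are assumed to mirror NeuralNetworkClassifier.
using MLJ, MLJFlux

# Toy binary dataset: two interleaving half-moons, y a two-level categorical vector.
X, y = make_moons(200)

clf = NeuralNetworkBinaryClassifier(
    builder = MLJFlux.MLP(hidden = (16, 8)),  # hidden layers; the final layer has size 1
    epochs = 10,
)

mach = machine(clf, X, y)
fit!(mach)
yhat = predict(mach, X)  # probabilistic predictions, via the sigmoid on the size-1 output
```

The design point is that a single sigmoid output replaces the two-unit softmax the multiclass constructor would otherwise build for a two-level target.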
PR Checklist