Closed: iamDecode closed this pull request 3 years ago
Merging #25 (84f380b) into master (afb42d0) will not change coverage. The diff coverage is 100.00%.
```diff
@@           Coverage Diff            @@
##           master       #25   +/-   ##
==========================================
  Coverage   100.00%   100.00%
==========================================
  Files           12        12
  Lines          624       729   +105
==========================================
+ Hits           624       729   +105
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| sklearn_pmml_model/ensemble/\_\_init\_\_.py | 100.00% <100.00%> (ø) | |
| sklearn_pmml_model/ensemble/forest.py | 100.00% <100.00%> (ø) | |
| sklearn_pmml_model/ensemble/gb.py | 100.00% <100.00%> (ø) | |
| sklearn_pmml_model/tree/\_\_init\_\_.py | 100.00% <100.00%> (ø) | |
| sklearn_pmml_model/tree/tree.py | 100.00% <100.00%> (ø) | |
Continue to review full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update afb42d0...84f380b. Read the comment docs.
As gradient boosting uses regression trees internally, #23 already contained most of the preparation needed to support regression in tree-based models. This PR adds two new classes for regression problems: `PMMLTreeRegressor` (extending `DecisionTreeRegressor`) and `PMMLForestRegressor` (extending `RandomForestRegressor`).