Either "author intended to make this algorithm Parameterized but likely hasn't gotten to implementing it yet"? OR that there are no parameters that need to be tuned for it to work well. Since a lot of times there are parameters that can have a single default value that works most of the time, and adding it to the search would be more work for little to no gain (even negative gain!). Thats one reason why I liked the priority idea you had in another issue. You could have "High" i.e.: needs to be tuned no matter what, "medium": can tune if you wan't, not a big deal, and low: almost never needs to be adjusted. But not sure how I want to implement that yet. And coding taking a break to health stuff.
Very sorry to hear about health stuff! Best of luck, hope it works out ok.
Me too! It's slowly working out, hoping to be back to normal by the summer.
I added a note about this in the documentation, so I'm just going to close this issue.
What should be the interpretation of the case when `classifier instanceof Parameterized`, but `autoAddParameters(trainDS) == 0`? This comes up with SAMME, ArcX4, ModestAdaBoost, DecisionStump, AdaBoostM1PL, StochasticMultinomialLogisticRegression, NaiveBayes, and DDAG.
Should I take it as "for this specific dataset, there were no good parameters for RandomSearch to chew on -- but keep trying, other DataSets may work better!" or as "author intended to make this algorithm Parameterized but likely hasn't gotten to implementing it yet"?
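
For context, the calling pattern in question looks roughly like this (sketch only: the `instanceof Parameterized` check and `autoAddParameters(trainDS)` call are the ones discussed here, while the `RandomSearch` constructor and `trainC` signatures are my assumptions about the surrounding JSAT API):

```java
import jsat.classifiers.ClassificationDataSet;
import jsat.classifiers.Classifier;
import jsat.parameters.Parameterized;
import jsat.parameters.RandomSearch;

public class TuneIfPossible
{
    /**
     * Tunes the classifier with RandomSearch when it exposes tunable parameters,
     * otherwise falls back to training it with its defaults.
     */
    public static Classifier tuneOrDefault(Classifier classifier, ClassificationDataSet trainDS)
    {
        if (classifier instanceof Parameterized)
        {
            // Assumed constructor: (base classifier, cross-validation folds)
            RandomSearch search = new RandomSearch(classifier, 10);
            int added = search.autoAddParameters(trainDS);
            if (added == 0)
            {
                // The ambiguous case: Parameterized, but nothing was added to search over.
                // Per the answer above, treat it as "the defaults are fine" and just train.
                classifier.trainC(trainDS);
                return classifier;
            }
            search.trainC(trainDS);
            return search;
        }
        // Not Parameterized at all: nothing to tune.
        classifier.trainC(trainDS);
        return classifier;
    }
}
```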