Closed · dsanalytics closed this issue 2 years ago
No, it's not possible, unfortunately. Orange uses scikit-learn's Multilayer Perceptron, which does not allow zero hidden layers.
@dsanalytics, what is your goal? To be able to demonstrate that no-hidden-layer NN is equal to log reg?
@markotoplak I know that already. I guess sklearn devs know better than anyone else how Data Science or Algorithms R&D are supposed to be done. Saying in a paper: "Dear reader, I cannot have single layer NN to show you, but trust me, it's not a big deal as sklearn devs know that nobody in the world is using those anymore", does not look good.
Then don't say it.
Alternatively, explain that NN without a hidden layer is just logistic regression.
> Saying: "Dear reader, I cannot have single layer NN to show you, but trust me, it's not a big deal as sklearn devs know that nobody in the world is using those anymore", does not look good.
I don't know what this is precisely referring to, but if it is to my comment, then you are misunderstanding how Orange is structured. We base our algorithms on sklearn as it is the best supported ML library for Python in existence. We will not be writing our own algorithms for NN just because you think sklearn's are not good enough for you. Your beef is with sklearn - write to them and argue for your use case. When they support it, we will.
@ajdapretnar that was to Marko's comment. I know that it's sklearn's fault and that you rely on it - hence I did not blame Orange.
It is not their fault. Calling a NN without hidden layers a NN is misleading. If somebody talked about a NN with no hidden layers, I'd have to think for a while, then I'd ask if they meant logistic regression. And attempting to construct a NN without hidden layers is almost certainly a mistake, so it's better to report it as such.
What do you call polynomial regression with powers up to 0? I would prefer to call it "the average".
What do you call a classification tree with a depth of 0? I would prefer calling it "majority classifier".
What do you, then, call a NN without hidden layers?
A motor bike without a motor is just a bike.
Khm, I just checked, and `sklearn.neural_network.MLPClassifier(hidden_layer_sizes=())` worked for me. So scikit-learn does not limit it - Orange does. :)
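For anyone who wants to reproduce this, here is a minimal sketch (assuming scikit-learn is installed; the toy data, solver, and parameters are made up for illustration, and behaviour may differ across versions):

```python
from sklearn.neural_network import MLPClassifier

# Toy, linearly separable data (invented for illustration).
X = [[0.0], [1.0], [2.0], [3.0]]
y = [0, 0, 1, 1]

# An empty tuple means no hidden layers: the inputs feed the output
# layer directly, so the model is essentially logistic regression.
clf = MLPClassifier(hidden_layer_sizes=(), solver="lbfgs",
                    max_iter=1000, random_state=0)
clf.fit(X, y)

# With no hidden layers there is only one weight matrix (input -> output).
print(len(clf.coefs_))
print(clf.predict([[0.5], [2.5]]))
```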
I vote for allowing the widget to work without hidden layers. If nothing else, it makes demonstrating the similarities between a NN without hidden layers and other methods easier. That could be useful in an educational setting.
I nevertheless don't like it.
What about the training? It does not work the same; does it (usually?) give the same result?
@janezd I do not want to get into a debate here too, but I disagree. SLNN is not the same as LR even with sigmoid, as the training differs: backprop vs max likelihood. But I'm sure you know all this, professor.
@markotoplak great find!
> What about the training? It does not work the same; does it (usually?) give the same result?
Yep, the training is different and has more options. For iris, it runs out of iterations by default, so it only managed to split class 0 from 1 and 2. But even that is interesting, because it shows how fitting the same model in different ways may produce different outcomes.
Also, the regularization parameter works differently between the two: `alpha` versus `C`.
But still, why not allow it? The only thing to modify (almost) is not to auto-fill the empty edit box.
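The iris comparison above can be sketched like this (assuming a recent scikit-learn; the exact scores, and whether the default solver runs out of iterations, depend on version and settings):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

# Same model family, different fitting procedures and defaults.
# Note the regularization knobs: MLPClassifier's `alpha` is an L2
# penalty strength, while LogisticRegression's `C` is the *inverse*
# of the penalty strength.
lr = LogisticRegression(max_iter=1000).fit(X, y)
nn = MLPClassifier(hidden_layer_sizes=(), solver="lbfgs",
                   max_iter=1000, random_state=0).fit(X, y)

print("logistic regression:", lr.score(X, y))
print("no-hidden-layer NN: ", nn.score(X, y))
```

With enough iterations both models fit iris well; with the widget's defaults the optimizer may stop early, which is what produced the partial split described above.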
> SLNN is not the same as LR even with sigmoid as training differs: backprop vs max likelihood.
@dsanalytics, the training could be the same. NN backprop simplifies to exactly the same update rules as gradient descent for logistic regression. It is just that libraries usually use different optimizers for the two problems.
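To make that concrete, here is a pure-Python toy sketch (data and learning rate invented for illustration): applying the backprop delta rule to a no-hidden-layer sigmoid network gives exactly the gradient-descent update for the logistic-regression log-likelihood.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy, linearly separable data (invented for illustration).
X = [0.0, 1.0, 2.0, 3.0]
y = [0, 0, 1, 1]

w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    for xi, yi in zip(X, y):
        y_hat = sigmoid(w * xi + b)
        # The output-layer backprop delta for cross-entropy + sigmoid
        # is (y_hat - y), which is also the logistic-regression gradient
        # factor -- the two updates coincide term by term.
        delta = y_hat - yi
        w -= lr * delta * xi
        b -= lr * delta

preds = [int(sigmoid(w * xi + b) > 0.5) for xi in X]
print(preds)
```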
@markotoplak I was testing it with
`clf = MLPClassifier(solver='lbfgs', alpha=1e-5, hidden_layer_sizes=(0,), random_state=1)`
Which probably means one cannot pass it an explicit 0, but an empty edit box might work.
> @markotoplak I was testing it with
> `clf = MLPClassifier(solver='lbfgs', alpha=1e-5, hidden_layer_sizes=(0,), random_state=1)`
This means one hidden layer with 0 neurons. No wonder it did not work. :)
Right, my bad. In this case, I suppose we should support it.
Ugh, I still think it's a bad idea, but OK if you think we must. Perhaps at least with a warning or information that it's better to use LR?
@ajdapretnar Thank you! @markotoplak Hopefully, you'll get a chance to add no-reg option to LR soon: https://github.com/biolab/orange3/issues/5816
I could be misreading the NN widget help file [1], but is it possible to specify a single-layer NN, i.e. one with no hidden layers? If I enter 0 for the number of hidden-layer neurons, it gives an error (screenshot no 1), while if I leave it blank, it defaults to 10. As an example, for a NN with 4 inputs and one logistic output, what would I need to enter in the neurons count box to get it (screenshot no 2)?
[1] Orange Data Mining - Neural Network Widget https://orangedatamining.com/widget-catalog/model/neuralnetwork/
Env: Windows + Orange 3.31