NiclasSchwalbe opened 5 years ago
Hey,
I have created a network like this:

```java
nn = new BasicNetwork();
nn.addLayer(new BasicLayer(null, true, 21));
nn.addLayer(new BasicLayer(new ActivationSigmoid(), true, 200));
nn.addLayer(new BasicLayer(new ActivationSigmoid(), true, 200));
nn.addLayer(new BasicLayer(new ActivationSigmoid(), true, 200));
nn.addLayer(new BasicLayer(new ActivationSigmoid(), true, 100));
nn.addLayer(new BasicLayer(new ActivationSigmoid(), true, 50));
nn.addLayer(new BasicLayer(new ActivationSigmoid(), false, 4));
nn.getStructure().finalizeStructure();
nn.reset();
```
After this I created an output method:

```java
public double[] getOutput(MLData input) {
    double[] output = nn.compute(input).getData();
    for (double w : output) {
        if (w > 1.0 || w < 0.0) System.out.println(w);
    }
    return output;
}
```
This network is able to return values smaller than zero and bigger than one. How on earth is this possible? I checked your sigmoids; they work correctly. Are there weights applied after the last layer? How can I remove them?
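For reference, here is the standalone check I used to convince myself the sigmoid itself is fine (plain Java, no Encog; the `sigmoid` helper is my own reimplementation of the standard logistic function, not Encog's `ActivationSigmoid` code). For any finite input it stays inside [0, 1], so if the output layer really applied it, out-of-range values should be impossible:

```java
public class SigmoidCheck {
    // Standard logistic sigmoid: 1 / (1 + e^-x).
    // This is my own helper, written to mirror what ActivationSigmoid computes.
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    public static void main(String[] args) {
        // Extreme and ordinary inputs; the result must never leave [0, 1].
        double[] inputs = {-1000.0, -5.0, 0.0, 5.0, 1000.0};
        for (double x : inputs) {
            double y = sigmoid(x);
            if (y < 0.0 || y > 1.0) {
                throw new AssertionError("sigmoid out of range: " + y);
            }
            System.out.println(x + " -> " + y);
        }
        System.out.println("all outputs in [0, 1]");
    }
}
```

Since this holds for every input I tried, my suspicion is that the values I see must come from something after the activation, not from the sigmoid itself.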