Hey,
I have created a network like this:

```java
nn = new BasicNetwork();
nn.addLayer(new BasicLayer(null, true, 21));                      // input layer: 21 neurons, bias
nn.addLayer(new BasicLayer(new ActivationSigmoid(), true, 200));  // hidden layers: sigmoid, bias
nn.addLayer(new BasicLayer(new ActivationSigmoid(), true, 200));
nn.addLayer(new BasicLayer(new ActivationSigmoid(), true, 200));
nn.addLayer(new BasicLayer(new ActivationSigmoid(), true, 100));
nn.addLayer(new BasicLayer(new ActivationSigmoid(), true, 50));
nn.addLayer(new BasicLayer(new ActivationSigmoid(), false, 4));   // output layer: 4 neurons, sigmoid, no bias
nn.getStructure().finalizeStructure();
nn.reset();                                                       // randomize the weights
```
After this I created an output method:

```java
public double[] getOutput(MLData input) {
    double[] output = nn.compute(input).getData();
    // report any output that falls outside the expected sigmoid range [0, 1]
    for (double w : output) {
        if (w > 1.0 || w < 0.0) System.out.println(w);
    }
    return output;
}
```
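Here is roughly how I call it (a minimal sketch from the same class; the `BasicMLData` wrapper and the all-zero feature vector are just placeholders for my real input):

```java
import org.encog.ml.data.MLData;
import org.encog.ml.data.basic.BasicMLData;

// Hypothetical driver in the same class as nn / getOutput: wraps a
// 21-value vector (matching the 21-neuron input layer) in a BasicMLData
// and prints the 4 raw network outputs.
public void printRawOutputs() {
    double[] features = new double[21];   // placeholder input values
    MLData input = new BasicMLData(features);
    for (double v : getOutput(input)) {
        System.out.println(v);            // some of these come back outside [0, 1]
    }
}
```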
This NN is able to return values smaller than zero and bigger than one. How on earth is this possible? I checked your sigmoid implementation; it works fine. Are there weights applied after the last layer? If so, how can I get rid of them?
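For reference, this is roughly the check I used to convince myself that the activation function itself is not the problem (a minimal sketch; it assumes Encog 3's in-place `ActivationFunction.activationFunction(double[], int, int)` API):

```java
import org.encog.engine.network.activation.ActivationSigmoid;

public class SigmoidRangeCheck {
    public static void main(String[] args) {
        ActivationSigmoid sigmoid = new ActivationSigmoid();

        // Pre-activation sums from -100 to +100 in steps of 1.
        double[] values = new double[201];
        for (int i = 0; i < values.length; i++) {
            values[i] = -100.0 + i;
        }

        // Apply the sigmoid in place over the whole array.
        sigmoid.activationFunction(values, 0, values.length);

        // A logistic sigmoid never leaves (0, 1), so nothing should print.
        for (double v : values) {
            if (v < 0.0 || v > 1.0) {
                System.out.println("out of range: " + v);
            }
        }
    }
}
```

Nothing prints here, yet the network's outputs still leave that range.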