EDIT: I managed to put together a couple of simple examples: https://github.com/developer239/neural-network-playground
I just started playing with neataptic. I wanted to make the neural network learn how to count using the numbers 1, 2, 3, 4, 5, 6, 7, 8, 9.
I normalized my inputs to 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9.
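(For reference, that normalization is just a division by 10; the helper name below is hypothetical, not part of neataptic:)

// Hypothetical helper: map a digit 1-9 onto 0.1-0.9.
const normalize = n => n / 10;
normalize(3); // 0.3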
Then I wrote a really simple training program that would teach the net how to add 1 + 2 (0.1 + 0.2).
const architect = require('neataptic').architect;

// Only two samples, both with 0.3 as the expected output.
const myTrainingSet = [
  { input: [0.1, 0.2], output: [0.3] },
  { input: [0.2, 0.1], output: [0.3] }
];

// 2 inputs, 3 hidden neurons, 1 output.
const myNetwork = architect.Perceptron(2, 3, 1);

myNetwork.train(myTrainingSet, {
  log: 1,           // log the error every iteration
  error: 0.01,      // stop once the error drops below 0.01
  iterations: 1000, // or after at most 1000 iterations
  rate: 0.3         // learning rate
});

console.log(myNetwork.activate([0, 0]));
console.log(myNetwork.activate([1, 1]));
console.log(myNetwork.activate([0.1, 0.2]));
The problem is that this logs:
[ 0.3717501873608793 ]
[ 0.3695919770977549 ]
[ 0.37142744367869446 ]
It basically logs 0.3 for every input. Can someone please explain what I have done wrong? :)
The dataset is too small for the neural network to learn any pattern from. You have only provided it with samples that have 0.3 as the output. The network minimizes its error by always outputting 0.3, because that is exactly what it was trained to do. I have created an example with 1000 (dynamically generated) samples, which seems to work: JSFiddle
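The linked JSFiddle isn't reproduced here, but a training set like that could be generated along these lines. This is a minimal sketch, not the exact fiddle code; the loop bounds and the training options are my assumptions:

const architect = require('neataptic').architect;

// Build 1000 random addition samples, keeping everything in [0, 1]:
// pick a from 1..8 and b from 1..(9 - a), so a + b never exceeds 9.
const trainingSet = [];
for (let i = 0; i < 1000; i++) {
  const a = Math.floor(Math.random() * 8) + 1;
  const b = Math.floor(Math.random() * (9 - a)) + 1;
  trainingSet.push({ input: [a / 10, b / 10], output: [(a + b) / 10] });
}

const myNetwork = architect.Perceptron(2, 3, 1);
myNetwork.train(trainingSet, {
  log: 100,
  error: 0.0001,
  iterations: 50000,
  rate: 0.3
});

console.log(myNetwork.activate([0.1, 0.2])); // should be close to 0.3
console.log(myNetwork.activate([0.3, 0.4])); // should be close to 0.7

With varied target values like this, the network can no longer minimize its error by emitting a constant, so the "always 0.3" behaviour disappears.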