Weights in feed-forward backpropagation ANN not changing


I am designing a feed-forward backpropagation ANN with 22 inputs and 1 output (either a 1 or a 0). The network has 3 layers and uses 10 hidden neurons. When I run it, the weights change only a tiny amount and the total error at the output stays around 40%. Initially I thought it was over/underfitting, but nothing changed after I varied the number of hidden neurons.

N is the number of inputs (22)

M is the number of hidden neurons (10)

This is the code I am using to backpropagate the output layer:

oin is the output value before it is passed through the sigmoid function

oout is the output value after it has passed through the sigmoid function

    // Error signal at the output node: activation derivative times
    // (target - actual output)
    double odelta = sigmoidDerivative(oin) * (TARGET_VALUE1[i] - oout);
    double dobias = 0.0;
    double doweight[] = new double[m];

    // Update each hidden-to-output weight: learning-rate term plus a
    // momentum term (oweight - oweight2 is the previous weight change)
    for (int j = 0; j < m; j++)
    {
        doweight[j] = (ALPHA * odelta * hout[j]) + (MU * (oweight[j] - oweight2[j]));
        oweight2[j] = oweight[j];
        oweight[j] += doweight[j];
    } // j

    // Update the output bias the same way
    dobias = (ALPHA * odelta) + (MU * (obias - obias2));
    obias2 = obias;
    obias += dobias;

    updateHidden(N, m, odelta);
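
The post doesn't show sigmoid or sigmoidDerivative. A minimal sketch of what they would typically look like, assuming the derivative is computed from the pre-activation value (as the call sigmoidDerivative(oin) suggests):

    // Sketch only (not shown in the original post): logistic sigmoid
    // and its derivative, taking the pre-activation value as input
    double sigmoid(double x)
    {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    double sigmoidDerivative(double x)
    {
        double s = sigmoid(x);
        return s * (1.0 - s);
    }

If sigmoidDerivative instead expects the already-squashed output, it should be called with oout rather than oin; a mismatch there is a common source of subtle training bugs.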

This is the code I am using to update the hidden neurons (the body of updateHidden, where d is the odelta passed in):

    // For each hidden neuron, back-propagate the output delta through
    // its outgoing weight, scaled by the activation derivative.
    // NOTE: oweight[j] has already been overwritten by the output-layer
    // update above, so this uses the new weights rather than the old ones.
    for (int j = 0; j < m; j++)
    {
        hdelta = (d * oweight[j]) * sigmoidDerivative(hin[j]);

        // Update each input-to-hidden weight: learning-rate term plus
        // momentum (hweight - hweight2 is the previous weight change)
        for (int i = 0; i < n; i++)
        {
            dhweight[i][j] = (ALPHA * hdelta * inputNeuron[i]) + (MU * (hweight[i][j] - hweight2[i][j]));
            hweight2[i][j] = hweight[i][j];
            hweight[i][j] += dhweight[i][j];
        } // i

        // Update the hidden bias the same way
        dhbias[j] = (ALPHA * hdelta) + (MU * (hbias[j] - hbias2[j]));
        hbias2[j] = hbias[j];
        hbias[j] += dhbias[j];
    } // j
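
For reference, both loops implement the standard delta rule with a momentum term; the previous weight change is recovered as the difference between the current weight and the saved copy (oweight2, hweight2):

\[
\Delta w_{ij}(t) = \alpha \, \delta_j \, x_i + \mu \, \Delta w_{ij}(t-1)
\]

where $x_i$ is the presynaptic activation (hout[j] for the output layer, inputNeuron[i] for the hidden layer).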

1 Answer

jfedail answered:

You are training your network to output two classes on a single node. The weights feeding that node keep adapting to predict one class and then the other, so most of the time they end up fitted to the dominant class in your data. To avoid this problem, add a second output node, so that you have two output nodes with each one corresponding to one class.
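
A minimal sketch of that suggestion (illustrative code, not from the answer; the method names and the class-to-node mapping are assumptions): encode each label as a one-hot target over two output nodes, and at prediction time pick the node with the larger output.

    // Sketch only: one-hot targets for a two-node output layer
    // (node 0 stands for class 1, node 1 for class 0 - an arbitrary choice)
    static double[] oneHotTarget(int label)
    {
        return (label == 1) ? new double[] { 1.0, 0.0 }
                            : new double[] { 0.0, 1.0 };
    }

    // Sketch only: the predicted class is the node with the larger output
    static int predictedClass(double[] oout)
    {
        return (oout[0] >= oout[1]) ? 1 : 0;
    }

The output-layer update then computes one odelta per node against its one-hot target, instead of a single delta for a lone output node.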