How to backpropagate in a multi-layer perceptron (MLP)?


This is my code for the backpropagation function:

back(target) {
  // Learning rate
  const learningRate = 0.1;

  // Error term (delta) of the layer handled in the previous iteration
  let delta = null;

  // Walk backwards from the output layer towards the input layer
  for (let layer = this.layers.length - 1; layer >= 0; layer--) {
    const neurons = this.layers[layer];

    // Calculating the gradient for output neurons is different from hidden layers
    if (layer == this.layers.length - 1) {
      // Output delta: (activation - target) * sigmoid'(activation)
      delta = neurons.map((n, i) => (n - target[i]) * this.sigmoid_deriv(n));

      // Update the biases of the output neurons
      this.biases[layer - 1] = this.biases[layer - 1].map(
        (b, i) => b - learningRate * delta[i]
      );

      continue;
    }

    const weights = this.weights[layer];

    // Weight gradient: outer product of this layer's activations
    // with the delta of the layer above
    const w_gradient = neurons.map((n) => delta.map((d) => n * d));

    this.weights[layer] = w_gradient.map((gradients, n) =>
      gradients.map((gradient, w) => weights[n][w] - learningRate * gradient)
    );

    // Propagate delta backwards for the next iteration,
    // using the weights from before the update
    delta = multiply(
      dot(delta, transpose(weights)),
      neurons.map((n) => this.sigmoid_deriv(n))
    );

    if (layer > 0) {
      this.biases[layer - 1] = this.biases[layer - 1].map(
        (b, i) => b - learningRate * delta[i]
      );
    }
  }
}
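
For completeness, the function relies on three helpers that aren't shown here: multiply as an element-wise vector product, dot as a vector-by-matrix product, and transpose as a 2-D array flip. This is a minimal sketch of the shapes those calls need (my real implementations may differ slightly):

// Element-wise product of two equal-length vectors
function multiply(a, b) {
  return a.map((x, i) => x * b[i]);
}

// Vector-by-matrix product: result[j] = sum over i of vec[i] * matrix[i][j]
function dot(vec, matrix) {
  return matrix[0].map((_, j) =>
    vec.reduce((sum, v, i) => sum + v * matrix[i][j], 0)
  );
}

// Transpose a 2-D array: result[j][i] = matrix[i][j]
function transpose(matrix) {
  return matrix[0].map((_, j) => matrix.map((row) => row[j]));
}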

I'm initializing the weights and biases with random values between 0 and 1. Everything seems to run fine, but the network is not learning at all. I'm not sure whether I've understood the gradient calculation correctly at this point. Any help or pointer in the right direction is appreciated.
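
For reference, the calls this.sigmoid_deriv(n) above pass in the stored neuron value n, so if this.layers holds activations (post-sigmoid values), the derivative has to be expressed in terms of the activation for the gradient to come out right. A minimal sketch of the convention I mean (written as standalone functions here rather than class methods):

// Logistic sigmoid of a pre-activation x
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// Derivative of the sigmoid written in terms of the activation a = sigmoid(x):
// sigma'(x) = sigma(x) * (1 - sigma(x)) = a * (1 - a)
function sigmoid_deriv(a) {
  return a * (1 - a);
}

Separately, drawing initial weights only from 0 to 1 means every weight starts positive; a range centred on zero (for example -1 to 1) is the more common starting point, so that may also be worth trying.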
