In the gradient descent algorithm, how is -2 * wx derived?


This is part of a gradient descent algorithm:

this.updateWeights = function() {

  let wx;
  let w_deriv = 0;
  let b_deriv = 0;

  // Accumulate the gradient of the MSE loss over all points
  for (let i = 0; i < this.points; i++) {
    // wx is the residual (error) for point i: y_i - (w * x_i + b)
    wx = this.yArr[i] - (this.weight * this.xArr[i] + this.bias);
    w_deriv += -2 * wx * this.xArr[i];  // contribution to d(MSE)/dw
    b_deriv += -2 * wx;                 // contribution to d(MSE)/db
  }

  // Step against the averaged gradient, scaled by the learning rate
  this.weight -= (w_deriv / this.points) * this.learnc;
  this.bias -= (b_deriv / this.points) * this.learnc;
}
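For context, here is a runnable sketch of how this method might sit inside a small trainer object. The Trainer constructor, its initial values, and the sample data are assumptions added for illustration; only the updateWeights body comes from the code above.

// Assumed wrapper around the posted method, for illustration only;
// the field names follow how updateWeights uses them.
function Trainer(xArray, yArray, learningRate) {
  this.xArr = xArray;
  this.yArr = yArray;
  this.points = xArray.length;
  this.weight = 0;             // slope w, assumed starting value
  this.bias = 0;               // intercept b, assumed starting value
  this.learnc = learningRate;  // learning rate

  this.updateWeights = function() {
    let wx;
    let w_deriv = 0;
    let b_deriv = 0;
    for (let i = 0; i < this.points; i++) {
      wx = this.yArr[i] - (this.weight * this.xArr[i] + this.bias);
      w_deriv += -2 * wx * this.xArr[i];
      b_deriv += -2 * wx;
    }
    this.weight -= (w_deriv / this.points) * this.learnc;
    this.bias -= (b_deriv / this.points) * this.learnc;
  };
}

// Hypothetical usage: the data follows y = 2x + 1, so weight and bias
// should converge toward 2 and 1.
const t = new Trainer([1, 2, 3, 4], [3, 5, 7, 9], 0.05);
for (let iter = 0; iter < 2000; iter++) {
  t.updateWeights();
}
console.log(t.weight.toFixed(3), t.bias.toFixed(3)); // roughly "2.000 1.000"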
            

Could someone please explain this part:

-2 * wx * this.xArr[i]

How is this expression derived? What is the mathematical formula behind it?


1 Answer

Sen Lin

It's derived from the partial derivative of the MSE loss function with respect to w. I wrote out a simplified derivation on paper, hoping it's helpful (attached image).
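Since the image itself isn't reproduced here, here is a sketch of that standard derivation, written in the same notation as the question's code: wx is the residual y_i - (w x_i + b), and n is this.points.

\[
\text{MSE}(w, b) = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - (w x_i + b)\bigr)^2
\]
\[
\frac{\partial\,\text{MSE}}{\partial w}
 = \frac{1}{n}\sum_{i=1}^{n} 2\bigl(y_i - (w x_i + b)\bigr)\cdot(-x_i)
 = \frac{1}{n}\sum_{i=1}^{n} -2\,\bigl(y_i - (w x_i + b)\bigr)\,x_i
\]
\[
\frac{\partial\,\text{MSE}}{\partial b}
 = \frac{1}{n}\sum_{i=1}^{n} 2\bigl(y_i - (w x_i + b)\bigr)\cdot(-1)
 = \frac{1}{n}\sum_{i=1}^{n} -2\,\bigl(y_i - (w x_i + b)\bigr)
\]

The -x_i factor comes from the chain rule: the inner expression y_i - w x_i - b has derivative -x_i with respect to w and -1 with respect to b. In the loop, each term -2 * wx * x_i is accumulated into w_deriv and each -2 * wx into b_deriv; dividing by this.points supplies the 1/n, and the result is scaled by the learning rate and subtracted from weight and bias.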