Difference in calculating the derivative of an RBM when using Contrastive Divergence


Could anybody explain the difference between calculating the derivative in an RBM with -h_j * x_k versus -h_j(x) * x_k? I found source code with both implementations and I am not sure which one is better (and why).
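For concreteness, here is a minimal NumPy sketch of the two gradient terms being compared: one built from the hidden unit's sigmoidal activation h_j(x), the other from a binary sample h_j drawn from that activation. All names (W, b, x, etc.) are hypothetical and not taken from the source codes in question.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy dimensions and parameters (hypothetical names)
n_visible, n_hidden = 6, 4
W = rng.normal(scale=0.1, size=(n_hidden, n_visible))   # weight matrix
b = np.zeros(n_hidden)                                   # hidden biases
x = rng.integers(0, 2, size=n_visible).astype(float)     # one visible vector

# Variant A: h_j(x) -- the sigmoidal activation, i.e. P(h_j = 1 | x)
h_prob = sigmoid(W @ x + b)                  # shape (n_hidden,)
grad_term_activation = -np.outer(h_prob, x)  # -h_j(x) * x_k for all (j, k)

# Variant B: h_j -- a binary sample drawn from that activation
h_sample = (rng.random(n_hidden) < h_prob).astype(float)
grad_term_sample = -np.outer(h_sample, x)    # -h_j * x_k for all (j, k)
```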


1 Answer

Rudra Murthy

By h_j and h_j(x), are you referring to the j-th hidden neuron's binary sample and its sigmoidal activation, respectively? Assuming so: if you work out the derivative of the negative log-likelihood with respect to the weight connecting hidden unit j and visible unit k, the data-dependent term turns out to be P(h_j = 1 | x) * x_k, which is exactly the sigmoidal activation times the input. So -h_j(x) * x_k follows the exact gradient term, while -h_j * x_k replaces that probability with a binary sample, giving an unbiased but noisier estimate of the same quantity.
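As a non-authoritative sketch of how that plays out in practice, below is a minimal CD-1 weight update in NumPy that uses the hidden probabilities P(h_j = 1 | x) in the positive and negative statistics, and draws binary samples only to drive the Gibbs step. The function and variable names (cd1_update, W, b, c, lr) are assumptions for illustration, not taken from any particular implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_update(W, b, c, x, lr=0.1, rng=None):
    """One CD-1 step; the gradient statistics use P(h = 1 | v) rather than samples."""
    rng = rng or np.random.default_rng()

    # Positive phase: P(h_j = 1 | x) * x_k, the exact data-dependent term
    h0_prob = sigmoid(W @ x + b)
    pos_assoc = np.outer(h0_prob, x)

    # Sample hidden states only to drive the Gibbs step
    h0_sample = (rng.random(h0_prob.shape) < h0_prob).astype(float)

    # Negative phase: reconstruct the visibles, recompute hidden probabilities
    v1_prob = sigmoid(W.T @ h0_sample + c)
    h1_prob = sigmoid(W @ v1_prob + b)
    neg_assoc = np.outer(h1_prob, v1_prob)

    # Ascend the log-likelihood (equivalently, descend the negative log-likelihood)
    W += lr * (pos_assoc - neg_assoc)
    b += lr * (h0_prob - h1_prob)
    c += lr * (x - v1_prob)
    return W, b, c
```

Using the probabilities in the associations is the common low-variance choice; substituting binary samples (the -h_j * x_k variant) is also valid but introduces extra sampling noise, which is exactly the trade-off the question is about.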