Gaussian RBM with NReLU hidden units (in a DBN)?


I'm working on an RBM (for a DBN) for image classification. I'm working with two RBM layers: the first has Gaussian visible units and binary hidden units, and the second has binary visible units and softmax hidden units. It works quite well. I now want to try using Noisy Rectified Linear Units (NReLU) as the hidden layer, but I fail to understand how to implement them. Everything I've tried has just led to terrible results.

Now, if I understand correctly: the activation probability of a ReLU is simply p = max(0, x + N(0, 1)), but then how do I sample the values used to activate the visible units? Should the noise only be used in sampling and not in the activation probabilities?

Another thing: in some papers I've seen the noise given as N(0, 1), while others use N(0, sigmoid(x)).

So, what should the activation function be, and how should the values be sampled?


1 Answer

Baptiste Wicht (best answer):

Apparently:

Using max(0, x) as the mean activation and max(0, x + N(0, sigmoid(x))) for sampling seems to work for the RBM. Following Nair & Hinton (2010), sigmoid(x) here is the *variance* of the Gaussian noise, so the noise is only used when sampling, not in the activation itself.
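
For concreteness, here is a minimal NumPy sketch of that rule, assuming a standard RBM parameterization; the names `W` (visible-to-hidden weights), `b_h` (hidden biases) and `v` (a batch of visible states) are illustrative, not from any particular library:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nrelu_hidden(v, W, b_h, sample=True):
    """NReLU hidden layer of an RBM.

    Mean activation:  max(0, x)
    Noisy sample:     max(0, x + N(0, sigmoid(x)))
    where x = v @ W + b_h and sigmoid(x) is the *variance* of the noise.
    """
    x = v @ W + b_h
    if not sample:
        return np.maximum(0.0, x)                 # noise-free mean activation
    noise = rng.normal(0.0, np.sqrt(sigmoid(x)))  # std dev = sqrt(variance)
    return np.maximum(0.0, x + noise)

# Tiny shape-only usage example (random, untrained parameters):
v = rng.normal(size=(4, 6))             # batch of Gaussian visible states
W = 0.01 * rng.normal(size=(6, 8))      # small random weights
b_h = np.zeros(8)
h_sample = nrelu_hidden(v, W, b_h)                 # noisy samples
h_mean = nrelu_hidden(v, W, b_h, sample=False)     # noise-free means
```

A common choice during contrastive divergence, mirroring how probabilities and samples are split for binary units, is to drive the visible reconstruction with the noisy samples and accumulate the gradient statistics with the noise-free means.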