I want to use the RBM pretraining weights from the code accompanying Hinton's paper as the weights of MATLAB's native feedforwardnet from the Neural Network Toolbox. Can anyone help me set up or arrange the pre-trained weights for feedforwardnet?
For instance, I used Hinton's code from http://www.cs.toronto.edu/~hinton/MatlabForSciencePaper.html
and want to use the pre-trained weights in a MATLAB feedforwardnet:
W = hintonRBMpretrained;                             % vector of pre-trained RBM weights
net = feedforwardnet([700 300 200 30 200 300 700]);
net = setwb(net, W);                                 % setwb returns the modified network
How do I set up or arrange W so that it matches the feedforwardnet structure? I know how to pass a single weight vector, but I am afraid that the order of the weights in that vector is incorrect.
The MATLAB feedforwardnet function returns a Neural Network object with the properties described in the documentation. The workflow for creating a neural network with pre-trained weights is as follows:

1. Create the network with feedforwardnet
2. Configure it for your input and target data
3. Initialize the weights and biases
4. Overwrite the initial weights (and biases) with the pre-trained values
5. Train the network

Steps 1, 2, 3, and 5 are exactly as they would be when creating a neural network from scratch; step 4 is where the pre-trained weights come in. Let's look at a simple example:
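A minimal sketch, assuming the Iris data that ships with the toolbox (iris_dataset: 4 features, 3 classes) and two hidden layers of 16 units each:

[x, t] = iris_dataset;          % 4x150 inputs, 3x150 one-hot targets
net = feedforwardnet([16 16]);  % two hidden layers with 16 units each
net = configure(net, x, t);     % set input/output sizes from the data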
Now we have a neural network net with 4 inputs (sepal and petal length and width) and 3 outputs ('setosa', 'versicolor', and 'virginica'), plus two hidden layers with 16 nodes each. The weights are stored in the two fields net.IW and net.LW, where IW are the input weights and LW are the layer weights. This is confusing at first, but makes sense: each row in both of these cell arrays corresponds to one of the layers we have.
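You can inspect that layout directly; the sizes below follow from the 16-16 architecture assumed above:

net.IW               % 3x1 cell array; only IW{1,1} is non-empty
net.LW               % 3x3 cell array; only LW{2,1} and LW{3,2} are non-empty
size(net.IW{1,1})    % 16x4  : input -> first hidden layer
size(net.LW{2,1})    % 16x16 : first hidden -> second hidden layer
size(net.LW{3,2})    % 3x16  : second hidden -> output layer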
In the IW array, we have the weights between the input and each of the layers. Obviously, we only have weights between the input and the first layer. The shape of this weight matrix is 16x4, as we have 4 inputs and 16 hidden units.

In the LW array, we have the weights from each layer (the rows) to each layer (the columns). In our case, we have a 16x16 weight matrix from the first to the second layer, and a 3x16 weight matrix from the second to the third layer. Makes perfect sense, right?

With that, we know how to initialize the weights we have got from the RBM code:
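A sketch of that step, assuming the RBM pretraining produced weight matrices W1, W2, W3 and bias vectors b1, b2, b3 (hypothetical names) with the shapes listed above; depending on how the RBM code stores its matrices (e.g. visible-by-hidden), you may need to transpose them first:

net.IW{1,1} = W1;   % input -> first hidden layer, must be 16x4
net.LW{2,1} = W2;   % first -> second hidden layer, must be 16x16
net.LW{3,2} = W3;   % second -> output layer, must be 3x16
net.b{1} = b1;      % biases: one column vector per layer (16x1, 16x1, 3x1)
net.b{2} = b2;
net.b{3} = b3;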
With that, you can continue with step 5, i.e. training the network in a supervised fashion.
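For example, with the iris data from the sketch above, that would simply be:

net = train(net, x, t);   % supervised fine-tuning of the pre-trained weights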