TensorFlow dense layer operation

import tensorflow as tf  # TF 1.x API

a = tf.random_uniform([5, 3, 5])   # shape 5x3x5
b = tf.random_uniform([5, 1, 6])   # shape 5x1x6

tiled_b = tf.tile(b, [1, 3, 1])    # shape 5x3x6
c = tf.concat([a, tiled_b], 2)     # shape 5x3x11
d = tf.layers.dense(c, 10, activation=tf.nn.relu)  # shape 5x3x10

Here the output shape turned out to be 5x3x10, while the input shape is 5x3x11. I've looked at the source code of this operation and found that the weight matrix has shape 11x10. I also understand that the operation is equivalent to res = np.tensordot(input, weights, axes=([2], [0])). What I don't understand is how this happens. How do I visualize this operation in a neural network? Since the dense layer is just a single layer with 10 neurons, how can the weight matrix be 11x10?
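The tensordot mentioned above can be checked directly in NumPy; the shapes (input 5x3x11, weights 11x10) are taken from the question, and the random values are just placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
inputs = rng.random((5, 3, 11))    # same shape as the dense layer's input
weights = rng.random((11, 10))     # same shape as the dense layer's kernel

# Contract only the last axis of the input with the first axis of the weights
res = np.tensordot(inputs, weights, axes=([2], [0]))
print(res.shape)  # (5, 3, 10)
```

Only the last dimension is transformed (11 -> 10); the leading 5x3 dimensions pass through unchanged.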


There is 1 answer.

Ishant Mrinal (accepted answer):

For a dense layer, every input channel is connected to each output neuron with its own weight. Here input_channel = 11 and output_channel = 10, so the number of weights is 11x10. The same 11x10 weight matrix is applied independently at each of the 5x3 positions, which is why a single layer of 10 neurons handles a 3-D input.

# input 5x3x11, here last dimension is the input channel
dense_layer_weight_shape = [input_channel, output_channel]
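One way to visualize this is that the 5x3x11 input is flattened into 15 separate 11-vectors, each fed through the same 10-neuron layer. A minimal NumPy sketch (shapes from the question, random placeholder values) showing that the tensordot is exactly this per-position matrix multiply:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.random((5, 3, 11))   # input: last dimension is the input channel
w = rng.random((11, 10))     # dense_layer_weight_shape = [11, 10]

# tensordot over the channel axis...
via_tensordot = np.tensordot(x, w, axes=([2], [0]))

# ...equals flattening the leading dims to 15 rows, one plain matmul,
# then restoring the leading dims
via_matmul = (x.reshape(-1, 11) @ w).reshape(5, 3, 10)

print(np.allclose(via_tensordot, via_matmul))  # True
```

So the 11x10 matrix is the ordinary single-layer weight matrix; it is simply shared across all 15 positions of the batch and sequence dimensions.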