How does the LeakyReLU layer work without setting the number of units?


When building a Sequential model, I noticed there is a difference between adding a relu layer and a LeakyReLU layer.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LeakyReLU

test = Sequential()
test.add(Dense(1024, activation="relu"))
test.add(LeakyReLU(0.2))
  1. Why can't we add a layer with activation="LeakyReLU"? ("LeakyReLU" is not a string that Keras can work with.)
  2. When adding the relu layer, we set the number of units (1024 in my example). Why can't we do the same for LeakyReLU?

I was sure the only difference between relu and LeakyReLU was the behaviour of the function, but it seems to be more than that.


There are 2 answers

user3668129 On
    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense

    test = Sequential()
    test.add(Dense(1024, input_dim=784, activation="relu", name="First"))
    # a LeakyReLU instance is passed as the activation of this Dense layer
    test.add(Dense(512, activation=tf.keras.layers.LeakyReLU(alpha=0.01), name="middle"))
    test.add(Dense(1, activation='sigmoid', name="Last"))
    test.compile(loss='binary_crossentropy', optimizer="adam")
    test.summary()

output:

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
First (Dense)                (None, 1024)              803840    
_________________________________________________________________
middle (Dense)               (None, 512)               524800    
_________________________________________________________________
Last (Dense)                 (None, 1)                 513       
=================================================================
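
Note that the LeakyReLU instance passed as the activation argument runs inside the "middle" Dense layer, which is why it does not appear as a separate layer in this summary.
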
Kishore Sampath On
  1. We can specify the activation function in the Dense layer itself by using an alias such as activation='relu', which uses the default Keras parameters for relu. There is no such alias available in Keras for the LeakyReLU activation function, so we have to use tf.keras.layers.LeakyReLU or tf.nn.leaky_relu instead.

  2. We cannot set the number of units in a relu layer; it simply takes the output tensor of the previous layer and applies the activation function to it element-wise. The number of units you specified belongs to the Dense layer, not to relu. When we write Dense(1024, activation="relu"), the inputs are multiplied by the weights, the biases are added, and relu is applied to the result, all in a single line. With the approach from step 1, the same computation happens in two stages: first the Dense layer multiplies by the weights and adds the biases, then the LeakyReLU layer applies the activation (two lines), as in the sketch below.
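
To make the two stages concrete, here is a minimal sketch (assuming TensorFlow 2.x; the 784-dimensional input is borrowed from the first answer and the alpha of 0.2 from the question):

    import tensorflow as tf
    from tensorflow.keras import Input, Sequential
    from tensorflow.keras.layers import Dense, LeakyReLU

    # one stage: Dense computes inputs @ W + b and applies relu inside the same layer
    one_stage = Sequential([
        Input(shape=(784,)),
        Dense(1024, activation="relu"),
    ])

    # two stages: Dense(1024) computes inputs @ W + b with no activation, then
    # LeakyReLU is applied element-wise to that (None, 1024) tensor; it has no
    # units or trainable weights of its own, so it takes no unit count
    two_stage = Sequential([
        Input(shape=(784,)),
        Dense(1024),
        LeakyReLU(alpha=0.2),
    ])

    one_stage.summary()   # a single Dense layer with 803,840 parameters
    two_stage.summary()   # Dense + LeakyReLU layers, still 803,840 parameters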