How to implement Leaky ReLU in Keras from scratch?


How do I implement Leaky ReLU from scratch and use it as a custom activation function in Keras? I have a rough snippet, but I am not sure how close it is to the correct definition. My question comes in two parts:

1. Is my implementation correct?

2. If not, what am I doing wrong?

The implementation I am using:

from keras import backend as K
from keras.layers import Conv3D

def leaky_relu(x):
    # Leaky ReLU: identity for x > 0, alpha * x for x <= 0
    alpha = 0.1
    return K.maximum(alpha * x, x)

And usage:

x = Conv3D(64, kernel_size=(3, 3, 3), activation=leaky_relu, padding='same', name='3D_conv')(x)
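
As a quick numerical check (with alpha = 0.1, an input of -1.0 should map to -0.1 while positive values should pass through unchanged), this is what I would expect:

x_test = K.constant([-1.0, 0.0, 2.0])
print(K.eval(leaky_relu(x_test)))  # expected: [-0.1  0.   2. ]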

Any help would be very appreciated.

Accepted answer (Chicodelarose):

Yes, it is correct. I made a slight modification to the function to make it more reusable:

import tensorflow as tf

def LeakyReLU(alpha=0.3):  # 0.3 matches the default slope of Keras' LeakyReLU layer
    return lambda x: tf.keras.backend.maximum(alpha * x, x)

In this way, you could call the activation with different values of alpha:

x = Conv3D(64, kernel_size=(3, 3, 3), activation=LeakyReLU(0.1), padding='same', name='3D_conv')(x)
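
As a quick sanity check (a minimal sketch comparing the custom activation against TensorFlow's built-in tf.nn.leaky_relu with the same alpha):

import tensorflow as tf

x_test = tf.constant([-2.0, 0.0, 3.0])
print(tf.keras.backend.eval(LeakyReLU(0.1)(x_test)))               # expected: [-0.2  0.   3. ]
print(tf.keras.backend.eval(tf.nn.leaky_relu(x_test, alpha=0.1)))  # expected: [-0.2  0.   3. ]

Note that Keras also ships a built-in tf.keras.layers.LeakyReLU layer, which can be placed after the Conv3D layer if a dedicated layer is preferred over a custom activation function.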