Keras: How to use max_value in the ReLU activation function

The relu function, as defined in keras/activations.py, is:

    def relu(x, alpha=0., max_value=None):
        return K.relu(x, alpha=alpha, max_value=max_value)
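
A quick backend-level check of what max_value does (a sketch; the example values are my own):

    from keras import backend as K

    x = K.constant([-1., 5., 300.])
    # Negatives go to zero; values above max_value are clipped to it
    print(K.eval(K.relu(x, max_value=250.)))  # [  0.   5. 250.]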

It has a max_value argument which can be used to clip the activation. Now, how can this be used/called in the code? I have tried the following:

(a)

    model.add(Dense(512, input_dim=1))
    model.add(Activation('relu', max_value=250))

This fails with:

    assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg
    AssertionError: Keyword argument not understood: max_value

(b)

    Rel = Activation('relu', max_value=250)

This raises the same error.

(c)

    from keras import activations
    uu = activations.relu(??, max_value=250)

The problem with this is that it expects the input as its first argument. The error is 'relu() takes at least 1 argument (1 given)'.

So how do I make this a layer?

    model.add(activations.relu(max_value=250))

fails with the same error: 'relu() takes at least 1 argument (1 given)'.

If this function cannot be used as a layer, then there seems to be no way of specifying a clip value for ReLU. That would imply that the comment here https://github.com/fchollet/keras/issues/2119 closing a proposed change is wrong... Any thoughts? Thanks!

There are 4 answers

Answer by hans (0 votes)

That is as easy as one lambda:

    from keras.activations import relu
    clipped_relu = lambda x: relu(x, max_value=3.14)

Then use it like this:

    model.add(Conv2D(64, (3, 3)))
    model.add(Activation(clipped_relu))

When loading a model saved in HDF5, use the custom_objects dictionary:

    model = load_model(model_file, custom_objects={'<lambda>': clipped_relu})
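
Because a Python lambda's __name__ is '<lambda>', that is the name Keras serializes; if a model uses more than one custom activation, a named function avoids the clash. A sketch of that variant (the function name is my own choice):

    from keras.activations import relu

    def clipped_relu(x):
        # Same behaviour as the lambda, but serializes under 'clipped_relu'
        return relu(x, max_value=3.14)

    # model = load_model(model_file, custom_objects={'clipped_relu': clipped_relu})
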
Answer by MacChuck (0 votes)

I tested the following; it works:

    import keras
    from keras.layers import Dense

    def clip_relu(x):
        return keras.activations.relu(x, max_value=1.)

    predictions = Dense(num_classes, activation=clip_relu, name='output')
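
For completeness, here is a sketch of how that layer might sit in a functional-API model (the input shape and num_classes value are made up; clip_relu is the function defined above):

    from keras.models import Model
    from keras.layers import Input, Dense

    num_classes = 10                # placeholder value
    inputs = Input(shape=(100,))    # made-up input shape
    x = Dense(512, activation=clip_relu)(inputs)
    predictions = Dense(num_classes, activation=clip_relu, name='output')(x)
    model = Model(inputs=inputs, outputs=predictions)
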
Answer by Hongye Yang (0 votes)

This is what I did, using a Lambda layer to implement a clipped ReLU.

Step 1: define a function to do the clipping:

    from keras import backend as K

    def reluclip(x, max_value=20):
        return K.relu(x, max_value=max_value)

Step 2: add a Lambda layer to the model:

    from keras.layers import Lambda

    y = Lambda(function=reluclip)(y)
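
To use a different clip value without redefining the function, the Lambda layer's arguments parameter can pass it through (a sketch; the value 6 is arbitrary):

    y = Lambda(reluclip, arguments={'max_value': 6.})(y)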

Answer by Markus Eisenbach (2 votes)

You can use the ReLU function of the Keras backend. To do so, first import the backend:

    from keras import backend as K

Then, you can pass your own function as the activation, using backend functionality. This would look like:

    def relu_advanced(x):
        return K.relu(x, max_value=250)

Then you can use it like:

    model.add(Dense(512, input_dim=1, activation=relu_advanced))

or:

    model.add(Activation(relu_advanced))

Unfortunately, you must hard-code the additional arguments this way. It is therefore better to use a function that returns your function, passing in your custom values:

    def create_relu_advanced(max_value=1.):
        def relu_advanced(x):
            return K.relu(x, max_value=K.cast_to_floatx(max_value))
        return relu_advanced

Then you can pass your arguments either as:

    model.add(Dense(512, input_dim=1, activation=create_relu_advanced(max_value=250)))

or as:

    model.add(Activation(create_relu_advanced(max_value=250)))
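
One caveat when persisting such a model: Keras stores the activation under the inner function's name, so loading needs a matching custom_objects entry, following the same pattern as the lambda answer above (a sketch; model_file is a placeholder):

    from keras.models import load_model

    model = load_model(model_file,
                       custom_objects={'relu_advanced': create_relu_advanced(max_value=250)})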