Regularization losses in TensorFlow - TRAINABLE_VARIABLES to tensor array


I would like to add both L1 and L2 regularization to my loss function. When I define a weight variable I can choose which regularizer to attach, but it seems I can only choose one:

regLosses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)   # list of scalar penalty tensors, one per regularized variable
loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(labels=y_, logits=y_conv)) + tf.add_n(regLosses)

When I try to get the losses manually with

weights = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES)
l1Loss = tf.reduce_sum(tf.abs(weights))
l2Loss = tf.nn.l2_loss(weights)
loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(labels=y_, logits=y_conv)) + .1*l1Loss + .001*l2Loss

it doesn't work - I think because TRAINABLE_VARIABLES returns a list of Variable objects rather than a single tensor of parameter values. How do I fix this? Is my manual calculation of the L1 loss correct?
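If I understand the collection correctly, it is a plain Python list of tf.Variable objects with different shapes, so tf.abs and tf.nn.l2_loss cannot fold it into a single tensor. My best guess at a per-variable version (a rough sketch, assuming TF 1.x APIs and reusing the names above), in case that is the right direction:

import tensorflow as tf  # TF 1.x assumed

weights = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES)   # plain Python list of tf.Variable
# reduce each variable to a scalar penalty, then add the scalars together
l1Loss = tf.add_n([tf.reduce_sum(tf.abs(w)) for w in weights])
l2Loss = tf.add_n([tf.nn.l2_loss(w) for w in weights])          # tf.nn.l2_loss(w) = sum(w**2) / 2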

Thanks in advance

1 Answer

The Puternerd (Best Answer)

So I think I discovered the answer. Comments and review welcome.

When I create the weights I use:

W = tf.get_variable(name=name, shape=shape, regularizer=tf.contrib.layers.l1_regularizer(1.0))

Noting that the L1 regularization is simply the sum of the absolute values of the weights, and that L2 is the sum of the squared weights, I can do the following.

regLosses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)   # one scalar per variable: sum(|W|), since the scale is 1.0
l1 = tf.reduce_sum(tf.abs(regLosses))      # total L1 penalty across all variables
l2 = tf.reduce_sum(tf.square(regLosses))   # note: squares each variable's summed |W|
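One caveat on the L2 term: each entry in regLosses is already the summed |W| for one variable, so tf.square gives (sum|W|)^2 per variable rather than the usual sum of squared weights. If both exact penalties are wanted at variable creation time, contrib.layers also ships a combined regularizer in TF 1.x builds (worth checking that your version has it); a minimal sketch with illustrative names and scales, not taken from the post above:

import tensorflow as tf  # TF 1.x assumed

# attach both L1 and L2 penalties when the variable is created
# (name, shape, and scale values below are placeholders for illustration)
W = tf.get_variable(
    name="W", shape=[5, 5],
    regularizer=tf.contrib.layers.l1_l2_regularizer(scale_l1=0.1, scale_l2=0.001))

# the collection now holds one combined scalar penalty per variable; sum them
regLoss = tf.add_n(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES))

regLoss can then be added to the cross-entropy term just as in the question.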