How to add multiple losses into GradientTape


I am testing tf.GradientTape. I wrote a model with several output layers, each with its own loss, and I want to integrate GradientTape into the training loop. My question is: are there specific techniques for passing the several losses to the gradient computation as targets? I know one option is to take the mean of the losses. Is that always necessary? Can't I just pass in a list of losses so that the GradientTape knows which loss belongs to which output layer?
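For reference, here is a minimal sketch of the kind of multi-output setup the question describes. The layer sizes, head names, and loss functions are illustrative assumptions, not taken from the question; the answers below use names such as model, loss_fn_1, and loss_fn_2 in this spirit.

import tensorflow as tf

# Hypothetical two-headed model: a shared trunk feeding two output layers,
# each trained with its own loss.
inputs = tf.keras.Input(shape=(32,))
shared = tf.keras.layers.Dense(64, activation="relu")(inputs)
out_1 = tf.keras.layers.Dense(10, name="head_1")(shared)  # e.g. classification head
out_2 = tf.keras.layers.Dense(1, name="head_2")(shared)   # e.g. regression head
model = tf.keras.Model(inputs, [out_1, out_2])

loss_fn_1 = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
loss_fn_2 = tf.keras.losses.MeanSquaredError()
opt = tf.keras.optimizers.Adam()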


There are 2 answers

man hou

From the TensorFlow documentation: unless you set persistent=True, a GradientTape can only be used to compute one set of gradients.

So to compute gradients for multiple losses with non-persistent tapes, you need one tape per loss. Something like:

with tf.GradientTape() as t1:
    pred = model(x)                       # the forward pass must run inside the tape
    loss1_result = loss1(true, pred)
grads1 = t1.gradient(loss1_result, var_list1)

with tf.GradientTape() as t2:
    pred = model(x)
    loss2_result = loss2(true, pred)
grads2 = t2.gradient(loss2_result, var_list2)

Then apply the gradients:

opt1.apply_gradients(zip(grads1, var_list1))
opt2.apply_gradients(zip(grads2, var_list2))
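The snippet above leaves x, true, var_list1, var_list2, opt1, and opt2 undefined. One plausible way to fill them in (an assumption for illustration, not part of the original answer) is to give each loss its own optimizer and its own slice of the model's variables, reusing the head names from the sketch above:

# Hypothetical split: each optimizer updates only the variables of "its" head.
# Variables of the shared trunk are left out here for simplicity.
var_list1 = model.get_layer("head_1").trainable_variables
var_list2 = model.get_layer("head_2").trainable_variables
opt1 = tf.keras.optimizers.Adam(learning_rate=1e-3)
opt2 = tf.keras.optimizers.Adam(learning_rate=1e-3)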
user22601615

Example 1: Default

The default option sums the gradients of the two losses inside the gradient call: passing a list of losses to tape.gradient is equivalent to differentiating their sum.

with tf.GradientTape(persistent=True) as tp:  # persistent=True is only needed if gradient() is called more than once
    logits_1, logits_2 = model(X_train)
    loss1 = loss_fn_1(y_train_1, logits_1)
    loss2 = loss_fn_2(y_train_2, logits_2)
# A list of targets is differentiated as if the losses were summed.
grads = tp.gradient([loss1, loss2], model.trainable_variables)
opt.apply_gradients(zip(grads, model.trainable_variables))
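Because a list of targets is differentiated as a sum, one variant (not in the original answer) is to weight the losses before handing them to the tape, which is equivalent to differentiating w1 * loss1 + w2 * loss2:

# Weighted variant: scaling a loss scales its contribution to the update.
w1, w2 = 1.0, 0.5  # illustrative weights
grads = tp.gradient([w1 * loss1, w2 * loss2], model.trainable_variables)
opt.apply_gradients(zip(grads, model.trainable_variables))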

Example 2: Alternative

The alternative option gives you more freedom, because you can inspect the relationship between the two sets of gradients before combining them (see the similarity check after the code).

with tf.GradientTape(persistent=True) as tp:  # persistent=True is required: gradient() is called twice
    logits_1, logits_2 = model(X_train)
    loss1 = loss_fn_1(y_train_1, logits_1)
    loss2 = loss_fn_2(y_train_2, logits_2)
# Request zeros (instead of None) for variables a loss does not reach,
# so that the two gradient lists can be added element-wise.
grads_1 = tp.gradient(loss1, model.trainable_variables,
                      unconnected_gradients=tf.UnconnectedGradients.ZERO)
grads_2 = tp.gradient(loss2, model.trainable_variables,
                      unconnected_gradients=tf.UnconnectedGradients.ZERO)
# Note: grads_1 + grads_2 would concatenate the two Python lists;
# the gradients have to be summed element-wise instead.
grads_final = [g1 + g2 for g1, g2 in zip(grads_1, grads_2)]
opt.apply_gradients(zip(grads_final, model.trainable_variables))
del tp  # release the resources held by the persistent tape
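As one example of such a check (an illustrative addition, not from the original answer), the two gradient lists can be flattened and compared by cosine similarity; a negative value means the two losses pull the shared weights in conflicting directions:

# Flatten each gradient list into one vector and compare their directions.
flat_1 = tf.concat([tf.reshape(g, [-1]) for g in grads_1], axis=0)
flat_2 = tf.concat([tf.reshape(g, [-1]) for g in grads_2], axis=0)
cos_sim = tf.reduce_sum(flat_1 * flat_2) / (tf.norm(flat_1) * tf.norm(flat_2) + 1e-12)
print("gradient cosine similarity:", float(cos_sim))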

I tested both options on TensorFlow 2.5 and they gave me similar results.
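For anyone who wants to reproduce that comparison, a small sanity check (assuming both snippets are run from the same model state, before either apply_gradients call) could look like this:

# grads comes from Example 1, grads_final from Example 2.
for g_sum, g_pair in zip(grads, grads_final):
    tf.debugging.assert_near(g_sum, g_pair)  # raises if the two approaches disagree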