I have to deal with highly unbalanced data. As I understand, I need to use weighted cross entropy loss.
I tried this:
import tensorflow as tf
import numpy as np

weights = np.array([<values>])

def loss(y_true, y_pred):
    # weights.shape = (63,)
    # y_true.shape = (64, 63)
    # y_pred.shape = (64, 63)
    return tf.reduce_mean(
        tf.nn.weighted_cross_entropy_with_logits(labels=y_true, logits=y_pred, pos_weight=weights))
model.compile('adam', loss=loss, metrics=['acc'])
But there's an error:
ValueError: Creating variables on a non-first call to a function decorated with tf.function
How can I create this kind of loss?
I suggest, in the first instance, resorting to the class_weight argument of Keras's model.fit(). It is a dictionary mapping {label: weight}.
For example, if you have 20 times more examples of label 1 than of label 0, then you can write:
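A minimal runnable sketch, assuming a generic binary model and an illustrative 20:1 weighting (the model, data, and exact weights here are placeholders; tune them to your own label distribution):

```python
import numpy as np
import tensorflow as tf

# Toy imbalanced data: label 1 appears ~20x more often than label 0.
x = np.random.rand(210, 4).astype("float32")
y = np.array([0] * 10 + [1] * 200)

model = tf.keras.Sequential([tf.keras.layers.Dense(1, activation="sigmoid")])
model.compile("adam", loss="binary_crossentropy")

# Up-weight the rare class by (roughly) the imbalance ratio.
class_weight = {0: 20.0, 1: 1.0}
model.fit(x, y, epochs=1, class_weight=class_weight, verbose=0)
```

Keras then scales each sample's contribution to the loss by the weight of its class, which is equivalent in effect to a weighted cross entropy.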
This way you don't need to worry about implementing weighted CCE (categorical cross entropy) on your own.
Additional note: in your model.compile(), do not forget to use weighted_metrics=['accuracy'] in order to get a relevant reflection of your accuracy.
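For instance (a hedged sketch with a placeholder model; a plain metrics=['accuracy'] would report accuracy without applying the class weights):

```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, activation="sigmoid")])

# weighted_metrics applies the same class/sample weights used by the loss,
# so the reported accuracy reflects the reweighted data.
model.compile("adam", loss="binary_crossentropy", weighted_metrics=["accuracy"])

x = np.random.rand(100, 4).astype("float32")
y = np.array([0] * 5 + [1] * 95)
history = model.fit(x, y, epochs=1, class_weight={0: 20.0, 1: 1.0}, verbose=0)
```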