Define own loss and classification error in CNTK


I understand that people typically use the loss and error below for training:

from cntk import cross_entropy_with_softmax, classification_error, Trainer

ce = cross_entropy_with_softmax(z, label_var)
pe = classification_error(z, label_var)
trainer = Trainer(z, (ce, pe), ...)

Can we override these, or define our own loss and error functions? What we really need is to add weights when calculating the loss and error. For instance, with 4 classes, it is important (more weight) not to misclassify the first class as any of the others, and vice versa, but less important (less weight) if the model confuses the last 3 classes among themselves; a hypothetical cost matrix illustrating this is sketched below. What is the best way to handle that in CNTK?
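
For concreteness, the asymmetry could be written as a cost matrix whose entry (i, j) is the penalty for predicting class j when the true class is i. The specific values here are hypothetical, chosen only to illustrate the scheme:

import numpy as np

# Hypothetical 4x4 cost matrix: confusing class 0 with any other class
# (in either direction) costs 1.0, confusions among classes 1-3 cost
# only 0.2, and correct predictions on the diagonal cost nothing.
cost = np.array([[0.0, 1.0, 1.0, 1.0],
                 [1.0, 0.0, 0.2, 0.2],
                 [1.0, 0.2, 0.0, 0.2],
                 [1.0, 0.2, 0.2, 0.0]], dtype=np.float32)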


1 Answer

Emad Barsoum:

Yes, any CNTK expression is a valid loss or error. For example, here is cross entropy written out by hand:

import cntk as C

# Hand-rolled cross entropy: multiply the one-hot target by the log of
# the predicted probabilities element-wise, sum over the class axis,
# and negate.
ce = C.negate(C.reduce_sum(C.element_times(target, C.log(prediction)), axis=-1))
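
The same idea extends to the weighted loss and error the question asks about. Below is a minimal sketch, assuming 4 classes, a network output z, a one-hot target, and a hypothetical per-class weight vector weights (the weight values and the learner are made up for illustration):

# Hypothetical per-class weights: mistakes involving class 0 count 5x
# as much as mistakes on the other three classes.
weights = C.constant([5.0, 1.0, 1.0, 1.0])

prediction = C.softmax(z)   # class probabilities from the network output
log_prob = C.log(prediction)

# Weighted cross entropy: scale each class's term by its weight before
# summing over the class axis.
wce = C.negate(C.reduce_sum(
    C.element_times(weights, C.element_times(target, log_prob)),
    axis=-1))

# Weighted classification error: the usual 0/1 error scaled by the
# weight of the true class (picked out of the one-hot target).
true_class_weight = C.reduce_sum(C.element_times(weights, target), axis=-1)
werr = C.element_times(C.classification_error(z, target), true_class_weight)

trainer = C.Trainer(z, (wce, werr), learner)  # plug in as usual

Note that a single weight vector only scales by the true class; to encode the full 4x4 cost matrix from the question, you would instead weight each (true, predicted) pair, e.g. by multiplying the predicted distribution with the true class's row of the cost matrix.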