I'm trying to fit a regression model with a conditional, asymmetric loss function and am running into issues. I want to penalize "wrong-way" results, but which direction counts as wrong flips with the sign of the target variable.
import numpy as np
def CustomLoss(predict, true):
    # "Right-way" mask: prediction has the same sign as the target
    # and does not overshoot it in magnitude.
    ix = np.logical_and(predict * true > 0, np.abs(true) >= np.abs(predict))
    wrong_way = 2 * (predict - true) ** 2  # doubled penalty
    right_way = (predict - true) ** 2      # plain squared error
    return np.where(ix, right_way, wrong_way)
# CustomLoss(1,3) = 4
# CustomLoss(1,-1) = 8 ## Bigger loss for wrong way result
# CustomLoss(-2,-4) = 4
# CustomLoss(-2, 0) = 8 ## Bigger loss for wrong way result
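Since `np.where` is elementwise, the same function works on arrays; here is a quick sanity check of the four example values above:

```python
import numpy as np

def CustomLoss(predict, true):
    # Same-sign and not overshooting -> plain squared error, else doubled.
    ix = np.logical_and(predict * true > 0, np.abs(true) >= np.abs(predict))
    return np.where(ix, (predict - true) ** 2, 2 * (predict - true) ** 2)

print(CustomLoss(np.array([1, 1, -2, -2]), np.array([3, -1, -4, 0])))
# -> [4 8 4 8]
```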
I tried scipy.optimize; it converges for some data but not others. I'd have thought the function was still convex, so it should always converge.
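For context, this is roughly the kind of call I'm making. The linear model and synthetic data here are placeholders for illustration, not my real setup; Nelder-Mead avoids needing gradients of the piecewise loss:

```python
import numpy as np
from scipy.optimize import minimize

def custom_loss(predict, true):
    # Same-sign and not overshooting -> plain squared error, else doubled.
    ix = np.logical_and(predict * true > 0, np.abs(true) >= np.abs(predict))
    return np.where(ix, (predict - true) ** 2, 2 * (predict - true) ** 2)

# Synthetic linear-regression data (assumed setup, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
beta_true = np.array([1.5, -2.0])
y = X @ beta_true + 0.1 * rng.normal(size=100)

def objective(beta):
    return custom_loss(X @ beta, y).sum()

res = minimize(objective, x0=np.zeros(2), method="Nelder-Mead")
print(res.x)
```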
I've typically used CVXPY, but I can't figure out how to express the conditional part of the cost function there.