Enforcing that inputs sum to 1 and are contained in the unit interval in scikit-learn


I have three inputs, x=(A, B, C), and an output y. It needs to be the case that A+B+C=1 and 0<=A<=1, 0<=B<=1, 0<=C<=1. I want to find the x that maximizes y. My approach is to use a regression routine in scikit-learn to train a model f on my inputs x and outputs y, and then use numpy.argmax over f's predictions to find x_best. How can I ensure that x_best=(A,B,C) sums to 1 and all components lie within the unit interval? Is there some special encoding I can use?
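A minimal sketch of the argmax approach described above (the regressor choice, toy data, and grid resolution are illustrative assumptions): if the candidate points are generated so that they already satisfy the constraints, then whatever point the argmax picks lies on the simplex by construction.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Toy training data on the simplex (assumed; replace with your own X, y).
rng = np.random.default_rng(0)
X_train = rng.dirichlet([1.0, 1.0, 1.0], size=200)        # rows sum to 1, entries in [0, 1]
y_train = X_train[:, 0] * X_train[:, 1] + 0.5 * X_train[:, 2]

f = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Candidate grid restricted to A + B + C = 1 with 0 <= A, B, C <= 1.
grid = np.linspace(0.0, 1.0, 101)
a, b = np.meshgrid(grid, grid)
mask = a + b <= 1.0
candidates = np.column_stack(
    [a[mask], b[mask], (1.0 - a[mask] - b[mask]).clip(min=0.0)]
)

# argmax over predictions; x_best satisfies the constraints by construction.
x_best = candidates[np.argmax(f.predict(candidates))]
print(x_best, x_best.sum())
```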


1 Answer

Ami Tavory (accepted answer)

If I understand your question correctly, this is simply constrained optimization: all the constraints you mentioned (the equality A+B+C=1 and the unit-interval inequalities) are linear, so if the fitted model f is quadratic in x, maximizing it is just quadratic programming.
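A minimal sketch of this idea (assumptions: the quadratic model built from PolynomialFeatures + Ridge and the toy data are illustrative, and scipy.optimize.minimize with SLSQP stands in for a dedicated QP solver): fit a model that is quadratic in x, then minimize its negative subject to the sum-to-one equality constraint and the unit-interval bounds.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge

# Toy training data on the simplex (assumed; replace with your own X, y).
rng = np.random.default_rng(0)
X_train = rng.dirichlet([1.0, 1.0, 1.0], size=200)     # each row sums to 1
y_train = X_train[:, 0] * X_train[:, 1] + 0.5 * X_train[:, 2]

# Quadratic model, so the objective is quadratic and the problem is a QP.
f = make_pipeline(PolynomialFeatures(degree=2), Ridge()).fit(X_train, y_train)

# Maximize f(x) by minimizing -f(x) subject to sum(x) == 1 and 0 <= x_i <= 1.
def neg_f(x):
    return -f.predict(x.reshape(1, -1))[0]

constraints = [{"type": "eq", "fun": lambda x: x.sum() - 1.0}]
bounds = [(0.0, 1.0)] * 3
x0 = np.full(3, 1.0 / 3.0)                             # start at the centre of the simplex

res = minimize(neg_f, x0, method="SLSQP", bounds=bounds, constraints=constraints)
x_best = res.x
print(x_best, x_best.sum())
```

With this setup the returned x_best satisfies the sum-to-one and unit-interval constraints up to the solver's tolerance, rather than relying on any special encoding of the inputs.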