I have a multivariable function that I wish to minimize. The function has two input arguments, a vector c and a scalar \theta.
Using fmincon in MATLAB to solve the optimization problem for both c and \theta jointly is complicated, because certain values of \theta cause numerical errors. However, with \theta fixed, c can easily be obtained via fmincon without any errors.
So the plan now is a brute-force approach: compute c for each value of \theta in the range 1:100 (although the true constraint is only \theta \ge 0), plug each estimated pair back into the objective function, and choose the \theta (and the corresponding c) that gives the smallest objective value.
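Here is a minimal sketch of what I mean; objfun(c, theta) and c0 are placeholders for my actual objective and starting point, and the constraint arguments are left empty and would need to be filled in to match the real problem.

% Brute-force sweep over theta, solving for c at each fixed theta.
thetaGrid = 1:100;
bestVal = Inf; bestTheta = NaN; bestC = [];
opts = optimoptions('fmincon', 'Display', 'off');

for theta = thetaGrid
    % Inner problem: minimize over c with theta held fixed.
    [cHat, fval] = fmincon(@(c) objfun(c, theta), c0, ...
                           [], [], [], [], [], [], [], opts);
    if fval < bestVal
        bestVal = fval; bestTheta = theta; bestC = cHat;
    end
end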
Now, this doesn't sound very efficient to me, and I'm wondering whether I can employ a bisection-style approach so that I don't have to sweep every value of \theta in the range above.
Thanks a lot!
Bisection search over theta will only work if the objective (minimized over c at each fixed theta) is convex, or at least quasiconvex, in theta. Otherwise, you risk finding a local minimum instead of the global one.
Doing a nested fmincon, as @chipaudette suggests, should work if you choose an algorithm that can handle nonconvex problems. (The MATLAB help on this topic is a little vague, but I think the SQP algorithm should be OK.) But I suspect it will be more efficient to simply enumerate over the relevant range of theta, rather than using fmincon for the outer problem.
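If the profiled objective does turn out to be unimodal in theta, one way to get the bisection-like behavior without a full sweep is to wrap the inner fmincon solve in fminbnd (MATLAB's scalar golden-section / parabolic-interpolation minimizer). The following is only a sketch under that unimodality assumption; objfun(c, theta), c0, and the [0, 100] interval stand in for your actual objective, starting point, and theta bounds.

function [cHat, thetaHat, bestVal] = nestedSolve(objfun, c0)
% Outer 1-D search over theta via fminbnd, with an inner fmincon solve
% over c at each candidate theta. Only valid if the profiled objective
% is unimodal in theta on the search interval.
    opts = optimoptions('fmincon', 'Display', 'off');

    % Value of the inner minimization as a function of theta alone.
    profile = @(theta) innerValue(objfun, theta, c0, opts);

    % Search theta on [0, 100], mirroring the range in the question.
    [thetaHat, bestVal] = fminbnd(profile, 0, 100);

    % Recover the minimizing c at the chosen theta.
    cHat = fmincon(@(c) objfun(c, thetaHat), c0, ...
                   [], [], [], [], [], [], [], opts);
end

function fval = innerValue(objfun, theta, c0, opts)
    % Solve the inner problem for fixed theta and return its value.
    [~, fval] = fmincon(@(c) objfun(c, theta), c0, ...
                        [], [], [], [], [], [], [], opts);
end

You would call it as, e.g., [cHat, thetaHat, fval] = nestedSolve(@(c, theta) myObjective(c, theta), c0). If the unimodality assumption fails, the plain enumeration over theta (perhaps on a coarse grid first, then refined around the best value) is the safer fallback.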