Algorithms for minimizing single-variable functions


Given a continuous, convex single-variable function that I want to minimize over a bounded interval [a,b], what options do I have? I have access to the numerical derivative, but not the analytic derivative.

This is done inside a loop that will be run an arbitrarily large number of times, so it really needs to be as quick as possible. Bisection is elegant and simple, but I suspect it misses out on efficiency by not exploiting convexity and slope information.

1 Answer

Answered by Ami Tavory (accepted answer)

For this setting, I'd go with Golden Section Search.

  • Convexity implies unimodality, which is all this method requires.

  • Moreover, this method does not need derivatives. You can approximate derivatives numerically, but that is just another way of saying "multiple function evaluations"; you might as well spend those evaluations on golden-section partitions instead.
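To make this concrete, here is a minimal sketch of golden-section search in Python (the function name and tolerance parameter are my own choices, not from the answer). The key property is that each iteration reuses one of the two interior function values, so only one new evaluation of `f` is needed per step, and the interval shrinks by a constant factor of about 0.618:

```python
import math

def golden_section_minimize(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section search.

    Derivative-free: each iteration costs one new evaluation of f.
    """
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618

    # Two interior probe points, placed symmetrically in [a, b].
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    fc, fd = f(c), f(d)

    while b - a > tol:
        if fc < fd:
            # Minimum lies in [a, d]; old c becomes the new d.
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:
            # Minimum lies in [c, b]; old d becomes the new c.
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)

    return (a + b) / 2
```

For example, `golden_section_minimize(lambda x: (x - 2) ** 2, 0, 5)` converges to a value very close to 2. Since the interval shrinks geometrically, the number of iterations is roughly `log((b - a) / tol) / log(1 / 0.618)`, independent of how expensive `f` is.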