If I can compute the gradient and Hessian, will Newton's method significantly outperform BFGS/L-BFGS?

I have a 3-parameter estimation problem (so the dimension is low and memory is not a concern) where the objective function, gradient, and Hessian are slow to evaluate, because the objective is the result of a Monte Carlo simulation. However, my code is implemented so that I can obtain the gradient and Hessian via automatic differentiation (AD), which should be more accurate than a finite-difference approximation.
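
For concreteness, here is a minimal sketch of what the AD-generated derivatives might look like, assuming a hypothetical smooth 3-parameter objective `objective` written in JAX as a stand-in for the (fixed-seed) Monte Carlo estimate:

```python
import jax
import jax.numpy as jnp

def objective(theta):
    # Hypothetical smooth 3-parameter objective, standing in for the
    # Monte Carlo estimate (assumed deterministic, e.g. fixed random seed).
    return jnp.sum((theta - jnp.array([1.0, -2.0, 0.5])) ** 2) \
        + 0.1 * jnp.sum(theta ** 4)

grad_f = jax.grad(objective)     # exact gradient via reverse-mode AD
hess_f = jax.hessian(objective)  # exact 3x3 Hessian via AD
```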

I therefore want to reach the optimum in as few iterations, i.e. as few function and gradient/Hessian evaluations, as possible. Would the Newton-Raphson method then be the way to go, using the AD-generated Hessian?
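
To make the iteration I have in mind concrete, here is a sketch of a damped Newton loop using the AD Hessian; it relies on the hypothetical `objective`, `grad_f`, and `hess_f` from the snippet above, and is illustrative rather than my actual code:

```python
def newton(theta, n_iter=20, tol=1e-8):
    for _ in range(n_iter):
        g = grad_f(theta)
        if jnp.linalg.norm(g) < tol:
            break
        H = hess_f(theta)
        # Solve H @ step = -g for the Newton direction.
        step = jnp.linalg.solve(H, -g)
        # Fall back to steepest descent if the Newton direction is not a
        # descent direction (possible when H is not positive definite).
        if jnp.dot(step, g) > 0:
            step = -g
        # Simple backtracking line search, so each expensive Monte Carlo
        # function evaluation still makes progress.
        t, f0 = 1.0, objective(theta)
        while objective(theta + t * step) > f0 and t > 1e-4:
            t *= 0.5
        theta = theta + t * step
    return theta

theta_star = newton(jnp.zeros(3))
```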
