I have a function of four input variables that I am trying to minimize using the Levenberg-Marquardt optimization method. My previous results, where the Hessian/gradient was computed by forward-difference approximation, weren't accurate enough, so I wanted to pass the analytic gradient/Hessian as callable arguments to the least_squares() method. This is what I have tried --

Using Sympy, I calculated the gradient and the Hessian using

gradient_vec = [diff(obj_func, var) for var in (x1, x2, y1, y2)]
hessian_mat = [[obj_func.diff(var1).diff(var2) for var1 in (x1, x2, y1, y2)] for var2 in (x1, x2, y1, y2)]
grad_func = lambdify([x1, x2, y1, y2, f], gradient_vec, 'numpy')
hess_matr_func = lambdify([x1, x2, y1, y2, f], hessian_mat, 'numpy')

where f is an additional argument to both the gradient and Hessian functions. In my leastsq call I have (my objective function takes only one input):
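To make the setup above concrete, here is a minimal, runnable sketch of the same sympy pattern. The objective function below is made up purely for illustration, since the question's actual obj_func isn't shown:

```python
import numpy as np
import sympy as sp

# Hypothetical objective; the real obj_func from the question is not shown.
x1, x2, y1, y2, f = sp.symbols('x1 x2 y1 y2 f')
obj_func = (x1 - f)**2 + (x2 - 2*f)**2 + x1*y1 + y2**2

variables = (x1, x2, y1, y2)
gradient_vec = [sp.diff(obj_func, var) for var in variables]
hessian_mat = [[obj_func.diff(v1).diff(v2) for v1 in variables]
               for v2 in variables]

# f is an extra (fifth) argument to both lambdified callables.
grad_func = sp.lambdify([x1, x2, y1, y2, f], gradient_vec, 'numpy')
hess_func = sp.lambdify([x1, x2, y1, y2, f], hessian_mat, 'numpy')

g = np.array(grad_func(1.0, 2.0, 3.0, 4.0, 0.5))  # gradient at a test point
H = np.array(hess_func(1.0, 2.0, 3.0, 4.0, 0.5))  # Hessian at the same point
print(g.shape, H.shape)  # (4,) (4, 4)
```

Note that the callables now take five scalar arguments, while scipy's optimizers call the objective and Jacobian with a single array (plus whatever is in args=), which is exactly where the mismatch below comes from.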

result = leastsq(obj_fun, x0=np.random.uniform(size=(4,)), Dfun=grad_func, args=(f,))

I run this and I keep getting this error

TypeError: obj_fun() takes 1 positional argument but 2 were given

So, I tried the least_squares() function with the method='lm' argument, passing the Hessian as

result = least_squares(obj_fun, x0=np.random.uniform(size=(4,)), method='lm', jac=hess_matr_func, args=(f,))

and I still get the same error. How do I pass extra arguments (*args) to the gradient/Hessian callables alone? I tried using functools.partial to create a wrapper around the callable, and even that didn't help. Thanks very much for your help!

1 Answer

ev-br:

I don't think you can pass different extra arguments to the function and to its derivatives: args is forwarded to both.

One way around this is to store the extra argument as an attribute (Python functions can have attributes, too). Alternatively, create a class whose methods take a single argument, and store the extra value on the instance.
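A minimal sketch of the class-based workaround. The residual function here is made up for illustration (the question's obj_fun and grad_func aren't shown); the point is that the extra value f lives on the instance, so both callables take only x:

```python
import numpy as np
from scipy.optimize import least_squares

class Problem:
    def __init__(self, f):
        self.f = f  # extra argument stored on the instance, not passed via args=

    def residuals(self, x):
        # Single-argument residual function; reads f from the instance.
        return np.array([x[0] - self.f, x[1] - 2*self.f, x[2], x[3]])

    def jacobian(self, x):
        # Jacobian of the residuals above is the identity matrix.
        return np.eye(4)

p = Problem(f=0.5)
result = least_squares(p.residuals, x0=np.zeros(4), jac=p.jacobian, method='lm')
print(result.x)  # ~ [0.5, 1.0, 0.0, 0.0]
```

The function-attribute variant works the same way: set obj_fun.f = 0.5 once and read obj_fun.f inside the function body, so neither scipy call needs args= at all.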