SciPy: leastsq vs least_squares


SciPy provides two functions for nonlinear least squares problems:

optimize.leastsq() uses the Levenberg-Marquardt algorithm only.

optimize.least_squares() lets us choose among the Levenberg-Marquardt ('lm'), Trust Region Reflective ('trf'), and dogleg ('dogbox') algorithms.

Should we always use least_squares() instead of leastsq()?

If so, what purpose does the latter serve?
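For concreteness, here is a minimal sketch of the newer API's `method` parameter, using a toy residual function (the target point is just an illustrative choice):

```python
import numpy as np
from scipy.optimize import least_squares

# Toy residual: distance of the parameter vector from a fixed target.
# The minimum is obviously at (1.0, 2.0); all three methods should find it.
def residuals(p):
    return p - np.array([1.0, 2.0])

for method in ("trf", "dogbox", "lm"):
    res = least_squares(residuals, x0=[0.0, 0.0], method=method)
    print(method, res.x)
```

All three methods converge to the same solution here; they differ in how they handle bounds and large sparse problems ('lm' supports neither).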

1 answer

Answer by AudioBubble (accepted)

Short answer

Should we always use least_squares() instead of leastsq()?

Yes.

If so, what purpose does the latter serve?

Backward compatibility.

Explanation

The least_squares function is new in SciPy 0.17. Its documentation refers to leastsq as

A legacy wrapper for the MINPACK implementation of the Levenberg-Marquardt algorithm.

The original commit introducing least_squares actually called leastsq when the method was chosen to be 'lm'. But the contributor (Nikolay Mayorov) then decided that

least_squares might feel more solid and homogeneous if I write a new wrapper to MINPACK functions, instead of calling leastsq.

and so he did. As a result, leastsq is no longer required by least_squares, but I'd expect it to be kept around for a while to avoid breaking old code.
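In practice, migrating from leastsq to least_squares is usually a one-line change. A sketch of both calls on the same curve-fitting problem (the exponential model and synthetic data are just an illustrative assumption):

```python
import numpy as np
from scipy.optimize import leastsq, least_squares

# Synthetic data from y = a * exp(b * x) with a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(1.5 * x) + 0.01 * rng.standard_normal(x.size)

def residuals(p):
    a, b = p
    return a * np.exp(b * x) - y

p0 = [1.0, 1.0]

# Legacy API: returns a (solution, info_flag) tuple by default.
p_legacy, ier = leastsq(residuals, p0)

# Modern API: returns an OptimizeResult object; method='lm' uses
# the same Levenberg-Marquardt algorithm via a new MINPACK wrapper.
res = least_squares(residuals, p0, method="lm")

print(p_legacy)
print(res.x)
```

The two calls should agree to high precision, since both ultimately run MINPACK's Levenberg-Marquardt code; the main API difference is the richer OptimizeResult return value (res.x, res.cost, res.jac, and so on).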