Power law fit in Python


I have two columns of data, x and y. I am interested in doing a power-law fit of the form y = a + b*x^c. Are there packages available in Python which do this?


There are 2 answers

Mickey Diamant

Start by defining a model function:

def power(x, a, b, c):
    return a + b * x ** c

We can use the curve_fit function from scipy.optimize to find the values of a, b, and c which give the best fit.

from scipy.optimize import curve_fit

popt, pcov = curve_fit(power, x, y, p0=initial_guess)

'popt' is an array of the values for a, b, and c that give the best fit (note that there is no guarantee a solution exists or that the one you get is the optimal one).

'pcov' is the covariance matrix of the fitted parameters. The square root of its diagonal gives a one-standard-deviation uncertainty for each parameter.
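To make that concrete, here is a small self-contained sketch (using synthetic data with made-up parameter values, not the question's data) showing how to pull per-parameter uncertainties out of pcov:

```python
import numpy as np
from scipy.optimize import curve_fit

def power(x, a, b, c):
    return a + b * x ** c

# synthetic data from known parameters a=2, b=3, c=0.7, plus small noise
rng = np.random.default_rng(0)
x = np.linspace(1, 50, 40)
y = power(x, 2.0, 3.0, 0.7) + rng.normal(0, 0.05, x.size)

popt, pcov = curve_fit(power, x, y, p0=[1, 1, 0.5])

# one-standard-deviation uncertainty for each of a, b, c
perr = np.sqrt(np.diag(pcov))
```

With well-behaved data like this, popt should land close to the true values and perr stays small; if pcov comes back full of inf, the fit did not constrain the parameters.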

'initial_guess' is a list of three numbers that serve as initial values for a, b, and c in the first iteration of the fitting algorithm. You must provide these values yourself.

Another keyword that you can pass to curve_fit is 'bounds'. This limits the range over which the algorithm searches for the parameters a, b, and c. 'bounds' takes a tuple of two lists: the first contains the lower bound for each parameter, and the second the upper bounds.
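For illustration, a bounds call might look like this (a sketch with synthetic data; the particular bound values are arbitrary choices, not recommendations):

```python
import numpy as np
from scipy.optimize import curve_fit

def power(x, a, b, c):
    return a + b * x ** c

x = np.array([2.5, 5., 10., 20., 40., 160.])
y = power(x, 10.0, 5.0, 0.8)          # noise-free synthetic data

# keep a and b non-negative, and the exponent c between 0 and 1
lower = [0, 0, 0]
upper = [np.inf, np.inf, 1]
popt, pcov = curve_fit(power, x, y, p0=[1, 1, 0.5], bounds=(lower, upper))
```

Note that the initial guess must itself lie inside the bounds, and that passing bounds makes curve_fit switch from the default Levenberg-Marquardt method to a trust-region method.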

Here is an example.

import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit

x = np.array([  2.5,   5. ,  10. ,  20. ,  40. , 160. ])
y = np.array([  82933.97231252,  195697.01890895,  411424.83454658,
               689365.13501311, 1100703.42291616, 3069058.97376933])
initial_guess = [1e4, 1e4, 0.5]
popt, pcov = curve_fit(power, x, y, p0=initial_guess)

You should get

popt = array([-8.38814011e+04,  9.58024081e+04,  6.88182131e-01])
pcov = array([[ 1.49036273e+09, -4.72459139e+08,  8.87583051e+02],
              [-4.72459139e+08,  1.72727658e+08, -3.30764877e+02],
              [ 8.87583051e+02, -3.30764877e+02,  6.37612550e-04]])

fig, ax = plt.subplots()
ax.plot(x, y, ls='None', marker='.', ms=10)
X = np.linspace(0, 200, 201)
ax.plot(X, power(X, *popt))

Blue circles are the original data, and the orange line is the fit.

denis

The following fits two parameters with scipy least_squares (which is really nice, really powerful); you can easily modify it for a + b x^c --

import numpy as np
from scipy.optimize import least_squares
    # least_squares(fun, x0, jac='2-point', bounds=(-inf, inf), method='trf',
    #      ftol=1e-08, xtol=1e-08, gtol=1e-08, x_scale=1.0, loss='linear',
    # https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.least_squares.html

def fitpower( y, x: np.ndarray, c0=1, p0=1, verbose=0, **kw
) -> "OptimizeResult with .x .fun ...":
    """ fit y ~ c x^p with scipy least_squares """

    def err( cp ):
        c, p = cp
        return y - c * (x ** p)

    res = least_squares( err, x0=[c0, p0], jac='3-point', verbose=verbose, **kw )
    c, p = res.x
    if verbose:
        print( "fitpower: y ~ %.3g x^%.3g \n  err %s " % (
            c, p, res.fun ))
    return res  # an OptimizeResult with .x .fun ...
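As a sketch of the suggested modification (my adaptation, not the answer's code), the three-parameter form a + b x^c needs only a three-element x0 and a residual function with three unknowns:

```python
import numpy as np
from scipy.optimize import least_squares

def fitpower3(y, x, a0=1.0, b0=1.0, c0=0.5, **kw):
    """ fit y ~ a + b x^c with scipy least_squares """
    def err(abc):
        a, b, c = abc
        return y - (a + b * x ** c)
    return least_squares(err, x0=[a0, b0, c0], jac='3-point', **kw)

# noise-free synthetic data with a=4, b=2, c=0.6 (made-up values)
x = np.linspace(1, 100, 50)
y = 4.0 + 2.0 * x ** 0.6
res = fitpower3(y, x)
a, b, c = res.x
```

As with curve_fit, the result depends on the starting point; for badly scaled data it helps to pass x_scale or bounds through **kw.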

scipy.optimize.curve_fit is a wrapper for least_squares or leastsq.

If you have some background in statistics, see SO how-to-properly-fit-data-to-a-power-law-in-python.