I currently use scipy.optimize.minimize and scipy.optimize.leastsq to perform non-linear regression on my datasets. I would like to use PyMC(3) to investigate the posteriors for all the parameters involved in the fitting procedure. I came across this previous answer on SO.
This is a pretty good example to have available; most of the other examples I saw were for linear regressions. However, the example is not entirely suitable for my purposes. My model has a variable number of parameters, of which I would be fitting a subset. This subset would normally be in the range of 1 to 20 parameters, but sometimes more. With the scipy minimizers those varying parameters are delivered to the cost function as a 1D np.ndarray, p, e.g.
def chi2(p, *args):
    xdata = args[0]
    # ... further terms, one per fitted parameter
    return p[0] + xdata * p[1]
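For context, a minimization call then looks something like the following (a schematic sketch; the data and model here are placeholder stand-ins, not my real problem):

import numpy as np
from scipy.optimize import minimize

xdata = np.linspace(0.0, 10.0, 100)        # placeholder data
ydata = 0.5 + 1.5 * xdata

def chi2(p, xdata, ydata):
    model = p[0] + xdata * p[1]            # same schematic model as above
    return np.sum((ydata - model) ** 2)    # scalar cost for minimize

p0 = np.zeros(2)                           # one entry per fitted parameter
result = minimize(chi2, p0, args=(xdata, ydata))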
In the link given above, the @pymc.deterministic-decorated gauss function has keyword arguments. This is impractical for me, as the same code block needs to deal with varying (and sizeable) numbers of parameters. Is there any way of supplying a vector of parameters instead? I would also have to supply a prior for each of those parameters. However, since I have a list of lower and upper bounds for each parameter, [(min, max), ...], that shouldn't be a problem.
For a set of parameters, it is often best to use a single vector-valued stochastic and take its dot product with a design matrix:
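A minimal sketch of what that can look like, assuming placeholder data, a design matrix X with one column per parameter, and a list of (min, max) bounds; all the names here (xdata, ydata, bounds, beta, mu, tau) are illustrative, not prescribed:

import numpy as np
import pymc  # PyMC 2.3

xdata = np.linspace(0.0, 10.0, 100)                  # placeholder data
ydata = 1.0 + 2.0 * xdata + np.random.normal(0.0, 0.5, xdata.size)
bounds = [(-10.0, 10.0), (-10.0, 10.0)]              # [(min, max), ...] per parameter
lower, upper = (np.array(b) for b in zip(*bounds))

X = np.column_stack([np.ones_like(xdata), xdata])    # one column per parameter

# a single vector-valued stochastic covers all fitted parameters at once
beta = pymc.Uniform('beta', lower=lower, upper=upper)

@pymc.deterministic
def mu(beta=beta, X=X):
    return np.dot(X, beta)                           # model prediction for every x

tau = pymc.Gamma('tau', alpha=0.1, beta=0.1)         # observation precision
y = pymc.Normal('y', mu=mu, tau=tau, value=ydata, observed=True)

M = pymc.MCMC([beta, mu, tau, y])
M.sample(20000, burn=10000)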
or if you want an explicit baseline mean:
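Continuing the same sketch (again with illustrative names), the baseline mean becomes its own scalar stochastic and the design matrix keeps only the non-constant columns:

X = np.column_stack([xdata])                         # constant column removed
b0 = pymc.Uniform('b0', lower=-10.0, upper=10.0)     # explicit baseline mean
beta = pymc.Uniform('beta', lower=lower[1:], upper=upper[1:])

@pymc.deterministic
def mu(b0=b0, beta=beta, X=X):
    return b0 + np.dot(X, beta)

y = pymc.Normal('y', mu=mu, tau=tau, value=ydata, observed=True)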
(this is all PyMC 2.3)