Polynomial curve fitting


I want to implement polynomial curve fitting using the least-squares technique, but also with other error functions, i.e. not just least squares. Is there some way to do that in MATLAB? (I want to compare the results for different error functions. I also want to use regularisation, for which I need to change the error function.)

Can you share any resources (MATLAB/C++) that could help with implementing curve fitting without built-in functions? I could only find examples using Gaussian elimination; is that the same as least-squares fitting?


There are 3 answers

thb

Gaussian elimination is not the same as least-squares fitting. The sense in which it is not the same as least-squares fitting resembles the sense in which gasoline is not the same as driving.

Gaussian elimination is a technique for solving a linear system. Least-squares fitting constructs a linear system (the normal equations) and then solves it, so Gaussian elimination can be used as one step inside it.
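That relationship can be sketched in C++ (a minimal illustration, not production code; the function name `polyfit_normal_equations` is my own): least squares first *forms* the normal equations (A^T A) c = (A^T y) for the polynomial design matrix, and Gaussian elimination is then the solver applied to that small system.

```cpp
#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

// Sketch: fit a degree-d polynomial to (x, y) by forming the normal
// equations (A^T A) c = A^T y, where A is the Vandermonde matrix
// A[k][j] = x_k^j, then solving that (d+1)x(d+1) system with Gaussian
// elimination. Gaussian elimination is the inner solver, not the fit itself.
std::vector<double> polyfit_normal_equations(const std::vector<double>& x,
                                             const std::vector<double>& y,
                                             int degree) {
    int m = degree + 1;
    std::vector<std::vector<double>> M(m, std::vector<double>(m, 0.0));
    std::vector<double> b(m, 0.0);
    // Accumulate A^T A and A^T y from power sums of x.
    for (std::size_t k = 0; k < x.size(); ++k) {
        std::vector<double> powers(2 * m - 1);
        double xi = 1.0;
        for (int p = 0; p < 2 * m - 1; ++p) { powers[p] = xi; xi *= x[k]; }
        for (int i = 0; i < m; ++i) {
            b[i] += powers[i] * y[k];
            for (int j = 0; j < m; ++j) M[i][j] += powers[i + j];
        }
    }
    // Gaussian elimination with partial pivoting on the augmented system.
    for (int col = 0; col < m; ++col) {
        int pivot = col;
        for (int r = col + 1; r < m; ++r)
            if (std::fabs(M[r][col]) > std::fabs(M[pivot][col])) pivot = r;
        std::swap(M[col], M[pivot]);
        std::swap(b[col], b[pivot]);
        for (int r = col + 1; r < m; ++r) {
            double f = M[r][col] / M[col][col];
            for (int j = col; j < m; ++j) M[r][j] -= f * M[col][j];
            b[r] -= f * b[col];
        }
    }
    // Back substitution.
    std::vector<double> c(m);
    for (int i = m - 1; i >= 0; --i) {
        double s = b[i];
        for (int j = i + 1; j < m; ++j) s -= M[i][j] * c[j];
        c[i] = s / M[i][i];
    }
    return c;  // c[0] + c[1]*x + c[2]*x^2 + ...
}
```

Note that for high degrees or many points the normal equations become ill-conditioned, which is exactly why library implementations prefer QR or SVD-based (Moore-Penrose) solvers.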

In general, as far as I know, least-squares fitting in the generalized Moore-Penrose sense (see sect. 13.6 here; caution, heavy reading) is the canonical linear way to fit parameters. If you wish to use an unrelated error function, then you will either have to (a) depart from matrix techniques or (b) use less efficient iterative matrix techniques, which do not approach the power of Moore-Penrose.

I realize that this is probably not the answer you wanted, but I believe that it is the answer. If you find out differently, let us know.

Abul KM Rajib Hasan

Polynomial curve fitting is the first step towards learning "machine learning". My advice is to try least squares first and then understand the probabilistic treatment of curve fitting. You can find this in Bishop's book. In summary, you can assume that the target value (t) for an input value (x) comes from a Gaussian distribution, so the error can be minimized by maximizing the likelihood of the target values. This looks easy at the beginning, but the intuitive meaning has many insights. I would recommend you try this in MATLAB or R.
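The argument from Bishop's book goes roughly like this (a sketch; here y(x, w) is the polynomial with coefficients w and beta is the noise precision): assuming each target t_n is drawn from a Gaussian centred on the polynomial value,

```latex
p(t_n \mid x_n, \mathbf{w}, \beta)
  = \mathcal{N}\!\left(t_n \mid y(x_n, \mathbf{w}),\; \beta^{-1}\right),

\ln p(\mathbf{t} \mid \mathbf{x}, \mathbf{w}, \beta)
  = -\frac{\beta}{2} \sum_{n=1}^{N} \bigl( y(x_n, \mathbf{w}) - t_n \bigr)^2
    + \frac{N}{2} \ln \beta - \frac{N}{2} \ln (2\pi)
```

so maximizing the log-likelihood with respect to w is exactly minimizing the sum-of-squares error: the least-squares fit is the maximum-likelihood fit under Gaussian noise.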

Audrius Meškauskas

There is an open-source implementation of polynomial regression in C++ on GitHub under the MIT license here. It supports standard STL containers for input and separate types for data and calculations (you can feed and read uint8_t while running a 64th-degree regression with __float128), and it can differentiate, integrate and compute residuals. It is simple to use:

  std::vector<float> x, y; // provide data
  auto polynomial = polynomial_regression<2>(x, y);  // second degree polynomial

  // Interpolate at 0.5:
  std::cout << "f(0.5) = " << polynomial(0.5) << std::endl;

  // Coefficients:
  for (auto a: polynomial) {
    std::cout << a << std::endl;
  }