I am currently working on finding polynomial equations to describe data sets (simple tables of x/y pairs) in Python. I need a polynomial equation to describe the data set so that, given a number x, I can derive its y value.
If I enter these data sets into Apple's Numbers app and plot them on a graph, I can get a very helpful polynomial equation that predicts my y values very accurately for my purposes. However, when I use Numpy to derive a polynomial equation from the same data set, I get a very different polynomial equation that makes incredibly inaccurate predictions. I want my Python program to create polynomial equations that are much closer to the ones produced by the Numbers app.
Some specifics:
My x values (x_list): 3.75652173913043, 3.79130434782609, 3.82608695652174
My y values (y_list): 0.0872881944444445, 0.0872522935779816, 0.0858840909090909
My polynomial from Numbers: y = -0.5506x^2 + 4.1549x - 7.7508
My polynomial from Numpy: y = -7.586x^2 + 57.53x - 108.7
How I'm using Numpy: polynomial = numpy.poly1d(numpy.polyfit(x_list, y_list, deg=2))
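For completeness, here is that call as a small, self-contained script using the exact x and y values listed above; the print statements are only illustrative additions on my part:

```python
import numpy

# Data from the tables above
x_list = [3.75652173913043, 3.79130434782609, 3.82608695652174]
y_list = [0.0872881944444444, 0.0872522935779816, 0.0858840909090909]

# Same call as described above: least-squares fit of a degree-2 polynomial
coeffs = numpy.polyfit(x_list, y_list, deg=2)
polynomial = numpy.poly1d(coeffs)

print(coeffs)            # [a, b, c] for y = a*x^2 + b*x + c
print(polynomial(3.79))  # predicted y at x = 3.79
```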
My numbers in Python are rounded to 8 decimal places, but I think the discrepancy is too large for it to be a rounding issue.
In short, I am wondering how Numbers would have derived this polynomial vs. how Numpy would have, and how I can replicate the Numbers method, ideally without using Numpy. (I am going to have to translate my program from Python to Swift eventually.)
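For context, this is the kind of Numpy-free fit I have been sketching for the eventual Swift port. It assumes Numbers is doing an ordinary least-squares quadratic fit, which I have not confirmed, and it solves the degree-2 normal equations directly with Cramer's rule; `fit_quadratic` and `det3` are just names I made up for the sketch:

```python
def fit_quadratic(xs, ys):
    """Return (a, b, c) minimising sum((a*x^2 + b*x + c - y)^2)."""
    n = len(xs)
    s1 = sum(xs)
    s2 = sum(x * x for x in xs)
    s3 = sum(x ** 3 for x in xs)
    s4 = sum(x ** 4 for x in xs)
    t0 = sum(ys)
    t1 = sum(x * y for x, y in zip(xs, ys))
    t2 = sum(x * x * y for x, y in zip(xs, ys))

    def det3(m):
        # Determinant of a 3x3 matrix given as nested lists
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    # Normal equations:  [s4 s3 s2][a]   [t2]
    #                    [s3 s2 s1][b] = [t1]
    #                    [s2 s1 n ][c]   [t0]
    m = [[s4, s3, s2], [s3, s2, s1], [s2, s1, n]]
    d = det3(m)
    a = det3([[t2, s3, s2], [t1, s2, s1], [t0, s1, n]]) / d
    b = det3([[s4, t2, s2], [s3, t1, s1], [s2, t0, n]]) / d
    c = det3([[s4, s3, t2], [s3, s2, t1], [s2, s1, t0]]) / d
    return a, b, c


x_list = [3.75652173913043, 3.79130434782609, 3.82608695652174]
y_list = [0.0872881944444444, 0.0872522935779816, 0.0858840909090909]

a, b, c = fit_quadratic(x_list, y_list)
print(a, b, c)  # coefficients of y = a*x^2 + b*x + c
```

I am aware that solving the normal equations like this can be numerically fragile when the x values are tightly clustered (as mine are, around 3.8), and that shifting or centering the x values first would be safer, so I am not sure this is the right approach either.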