How to globally optimize the parameters of an ODE fit?


Consider the following ODE

[image: the ODE system for x1(t) and x2(t) with parameters α and β]

and the following datasets

[image: the measured datasets]

How can I find α and β that best fit this data?

My approach: As an example, consider the following data

x1 = np.array([91., 110., 125., 105., 88., 84.]) 
x2 = np.array([1.0, 0.97, 1.0, 0.95, 0.92, 0.8]) 

at time points

t_list = np.array([0, 5, 9, 18, 28, 38])
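
Since the ODE itself only appears in the image above, the sketch below uses a placeholder right-hand side rhs that you would replace with your actual equations; the chi-square weighting (relative residuals summed over both variables) is also an assumption and should be adapted to match the one in your script. It reuses t_list, x1, and x2 from above and shows how to turn the ODE plus the data into a scalar objective with scipy.integrate.solve_ivp:

import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, x, alpha, beta):
    # Placeholder right-hand side: replace with the actual ODE from the image
    dx1 = alpha * x[0] * x[1]
    dx2 = -beta * x[0] * x[1]
    return [dx1, dx2]

def chi_square(params):
    # Integrate the ODE at the measurement times and score the fit
    alpha, beta = params
    sol = solve_ivp(rhs, (t_list[0], t_list[-1]), [x1[0], x2[0]],
                    args=(alpha, beta), t_eval=t_list, method="LSODA")
    if not sol.success:
        return np.inf  # penalize parameter values where the integration fails
    res1 = (sol.y[0] - x1) / x1  # relative residuals (one possible weighting)
    res2 = (sol.y[1] - x2) / x2
    return np.sum(res1**2) + np.sum(res2**2)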

Using a numerical approach (see the Python script here) together with a polynomial regression of x1, I get the following fit

[image: fitted curves plotted against the data]

with α ≃ 0.000486, β ≃ 0.057717, and a chi-square error of 0.0017296. However, the result depends strongly on the initial guesses for α and β. Is it possible to use a global optimizer to improve this? I have tried shgo, but it always gives a worse fit (a higher chi-square). Any ideas?
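
One way to make the result less sensitive to the starting point is to give the optimizer explicit bounds on α and β and run a global method over that box, then polish the best candidate locally. A minimal sketch, assuming the chi_square objective sketched above and purely illustrative bounds (shgo accepts the same bounds argument if you prefer to stay with it):

from scipy.optimize import differential_evolution

# Illustrative bounds for (alpha, beta); widen or narrow them to bracket plausible values
bounds = [(1e-6, 1e-2), (1e-3, 1.0)]

# differential_evolution needs no initial guess; polish=True refines the best member with L-BFGS-B
result = differential_evolution(chi_square, bounds, seed=0, tol=1e-8, polish=True)

alpha_opt, beta_opt = result.x
print("alpha =", alpha_opt, "beta =", beta_opt, "chi-square =", result.fun)

If shgo gives a worse fit, it is often because the bounds are too wide or too few sampling points are used; tightening the bounds or increasing the number of sampling points (for example via its n and sampling_method arguments) may help.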
