I am trying to implement a system of ODEs with the R package deSolve and compiled C code, following the recommendations of the vignette https://cran.r-project.org/web/packages/deSolve/vignettes/compiledCode.pdf. In my model there is one parameter vector whose size I would like to allocate dynamically. The question was almost addressed in a previous thread, How to implement a system in deSolve, R, with N equations a N+m parameters?. More specifically, I would like to declare the total size of the parameter vector as a parameter rather than as a fixed value, as is done in that thread. Unfortunately, the compiler is not happy with this (a sketch of what I mean is shown after the example below), and I would like to know whether someone has an idea or a hint on how to solve it.
As an example, let me reuse the code kindly posted by @tpetzoldt in the previous thread, slightly modified to highlight my issue.
/* file model.c */
#include <R.h>

#define nA 6   /* length of the parameter vector A */
#define nT 9   /* total number of parameters (3 + nA) */

/* parameters, accessible by name (p.r, p.a) or as one flat vector (p.value) */
union parvec {
  struct {
    double r[3], a[nA];
  };
  double value[nT];
} p;

/* initializer: copies the parms vector passed from R into p.value */
void initmod(void (* odeparms)(int *, double *))
{
  int N = nT; /* total number of parameters */
  odeparms(&N, p.value);
}

/* Derivatives */
void derivs(int *neq, double *t, double *y, double *ydot,
            double *yout, int *ip)
{
  double y_sum = 0;
  for (int i = 0; i < *neq; i++) {
    y_sum = 0;
    for (int j = 0; j < *neq; j++) y_sum += p.a[i + *neq * j] * y[j];
    ydot[i] = p.r[i] * y[i] * (1 - y_sum);
  }
}
# file call_model.R
library(deSolve)
system("R CMD SHLIB model.c")
dyn.load("model.dll")
p <- c(r = c(0.1, 0.3, 0.04), A = c(0.2, 0.3, 0.3, 0.5, 0.4, 0.2))
y <- c(X = c(2, 2, 2))
times <- seq(0, 200, by = 0.1)
out <- ode(y, times, func = "derivs", parms = p,
           dllname = "model", initfunc = "initmod")
matplot.0D(out)
dyn.unload("model.dll")
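To make the change I have in mind more concrete, here is a rough sketch (not my exact code) of what I would like to write instead of the two #defines, i.e. with nA and nT set at run time:

/* sketch only: nA and nT as run-time values instead of #defines */
int nA;   /* length of the parameter vector A, to be set from R */
int nT;   /* total number of parameters, to be set from R       */

union parvec {
  struct {
    double r[3], a[nA];   /* rejected: size is not a compile-time constant */
  };
  double value[nT];       /* same problem here */
} p;

This does not compile, because members of a struct/union must have a size that is known at compile time (gcc reports something like "variably modified 'a' at file scope").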
I have looked at the vignette, in particular at section 3, "Alternative way of passing parameters and data in compiled code", but did not succeed in solving my problem.
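If I understand that section correctly, the idea would be to skip odeparms() entirely and fetch the parms object as a SEXP inside the initializer, so that its length is only needed at run time. Below is a rough sketch of the direction I tried; the routine name get_deSolve_gparms and the lookup via R_GetCCallable() are based on my reading of the vignette and the deSolve sources, so please correct me if I got the mechanism wrong:

/* sketch only: fetch parms as a SEXP, so the sizes are known at run time */
#include <R.h>
#include <Rinternals.h>
#include <R_ext/Rdynload.h>

static SEXP   gparms;  /* the parms object passed to ode()                      */
static double *par;    /* pointer to its values                                 */
static int    npar;    /* its length, only known at run time (could be checked) */

void initmod(void (* odeparms)(int *, double *))
{
  SEXP (*get_gparms)(void);
  get_gparms = (SEXP (*)(void)) R_GetCCallable("deSolve", "get_deSolve_gparms");
  gparms = get_gparms();
  par    = REAL(gparms);
  npar   = LENGTH(gparms);
}

/* derivatives, reading r and A from par[] instead of the fixed-size union */
void derivs(int *neq, double *t, double *y, double *ydot,
            double *yout, int *ip)
{
  double *r = par;          /* first *neq entries: growth rates      */
  double *a = par + *neq;   /* remaining entries: interaction terms  */
  double y_sum;
  for (int i = 0; i < *neq; i++) {
    y_sum = 0;
    for (int j = 0; j < *neq; j++) y_sum += a[i + *neq * j] * y[j];
    ydot[i] = r[i] * y[i] * (1 - y_sum);
  }
}

Is something like this the intended use of that section, or is there a simpler way to avoid hard-coding nA and nT?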