I'm running a sensitivity analysis on an epidemiological model with 14 uncertain parameters.
I'm using the maximinLHS function from the 'lhs' package in R (RStudio 2023.06.1+524), running on a 2020 MacBook Pro with a 2 GHz quad-core Intel Core i5.
#load the lhs package
library(lhs)
#set number of partitions/simulations
h <- 50000
#set number of uncertain parameters
num_params <- 14
#draw a Latin hypercube sample from a set of uniform distributions
lhs <- maximinLHS(h, num_params)
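(For context, once the draw finishes I map each column of the unit hypercube onto its parameter range with qunif; the bounds below are placeholders rather than my real values.)
#example only: rescale each unit-interval column onto a parameter range
param_min <- rep(0, num_params)    #placeholder lower bounds
param_max <- rep(10, num_params)   #placeholder upper bounds
params <- sapply(seq_len(num_params), function(j) {
  qunif(lhs[, j], min = param_min[j], max = param_max[j])
})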
I can draw an LHS within an hour or so with 10,000 partitions, but my supervisor reckons that 10,000 points in a 14-dimensional space is too sparse, so he wants me to increase it to 100,000. Looking at the literature, I don't see many papers using that many partitions, so I'm giving 50,000 a go, but even that has now been running for 2 days without completing.
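To get a feel for how long the 50,000 run might take, I've been timing smaller draws and extrapolating, assuming the run time scales roughly polynomially in the number of points (a rough sketch only):
#time smaller maximin draws (sizes small enough to finish reasonably quickly)
sizes <- c(1000, 2000, 4000)
times <- sapply(sizes, function(n) system.time(maximinLHS(n, num_params))["elapsed"])
#fit elapsed time against size on a log-log scale to estimate the scaling exponent
fit <- lm(log(times) ~ log(sizes))
#extrapolate to the full run of h points (predicted seconds; a very rough estimate)
exp(predict(fit, newdata = data.frame(sizes = h)))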
So my questions are...
- How do you know the right number of partitions for the number of parameters you are exploring?
- Are there any arguments to the maximinLHS function that might optimise this and speed it up?
- Is there any way to predict how long this is going to take?
(and yes I know I need a much more powerful computer to be running this sort of thing!)
Thanks