How many partitions should be used for a Latin hypercube sample, versus computational time?


I'm running a sensitivity analysis on an epidemiological model with 14 uncertain parameters.

I'm using the maximinLHS() function from the 'lhs' package in R (via RStudio 2023.06.1+524), running on a 2020 MacBook Pro with a 2 GHz quad-core Intel Core i5.

# load the lhs package
library(lhs)

# set the number of partitions/simulations (rows of the design)
h <- 50000

# set the number of uncertain parameters (columns of the design)
num_params <- 14

# draw a Latin hypercube sample on the unit hypercube, one uniform margin per parameter
lhs <- maximinLHS(h, num_params)
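
The sample comes back on [0, 1]^14, so I then map each column onto its own parameter range afterwards, roughly like this (the names and bounds below are placeholders, not my real parameters):

# map each column of the unit-hypercube sample onto a uniform parameter range
# (beta/gamma and the bounds are illustrative placeholders only)
beta  <- qunif(lhs[, 1], min = 0.1, max = 0.9)
gamma <- qunif(lhs[, 2], min = 1/14, max = 1/3)
# ...and so on for the remaining 12 columns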

I can draw an LHS with 10,000 partitions within an hour or so, but my supervisor reckons that 10,000 points in a 14-dimensional space is too sparse, so he wants me to up it to 100,000. Looking at the literature, I don't see many papers using that many partitions, so I'm giving 50,000 a go, but even this has now been running for 2 days without completing.
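
For what it's worth, this is a minimal sketch of how I've been trying to gauge the scaling: time maximinLHS() at a few smaller sizes and extrapolate on a log-log scale. The specific sizes and the power-law assumption are mine, not anything from the package documentation.

library(lhs)

# time maximinLHS() at a few modest sample sizes
sizes <- c(250, 500, 1000, 2000)
times <- sapply(sizes, function(n) system.time(maximinLHS(n, 14))["elapsed"])

# assume runtime grows roughly like a power law, time ~ a * n^b,
# and fit that on a log-log scale (my assumption, not something from the docs)
fit <- lm(log(times) ~ log(sizes))

# crude extrapolation to the target size
pred_secs <- exp(predict(fit, newdata = data.frame(sizes = 50000)))
cat("fitted exponent:", round(coef(fit)[2], 2), "\n")
cat("rough estimate for n = 50000:", round(pred_secs / 3600, 1), "hours\n")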

So my questions are...

  1. How do you know the right number of partitions for the number of parameters you are exploring?
  2. Are there any arguments to maximinLHS() that might speed it up? (The only fallback I've thought of is sketched after this list.)
  3. Is there any way to predict how long this is going to take?
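
On question 2, the only fallback I've come up with so far is dropping the maximin criterion at the full size and using randomLHS(), which runs in seconds, while keeping maximinLHS() only for a smaller design; I don't know whether that's defensible for a sensitivity analysis, which is partly why I'm asking. (I believe maximinLHS() also exposes tuning arguments such as method in the package docs, but I haven't worked out their effect on runtime.)

library(lhs)

# a plain (non-optimised) Latin hypercube: finishes in seconds even at 50,000 rows
lhs_plain <- randomLHS(50000, 14)

# keep the maximin criterion only at a size that completes in reasonable time
lhs_small <- maximinLHS(10000, 14)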

(and yes I know I need a much more powerful computer to be running this sort of thing!)

Thanks
