Hyperparameter tuning for each iteration or once in simulation study?


I want to run a simulation study comparing different classification models on simulated data, but I'm not sure how to set it up correctly. I want to compare the classifiers over 500 iterations: in each iteration I generate a new data set from the same DGP and then compare my classifiers on that data set.

Should I tune the hyperparameters of each classifier once before the simulations and reuse that configuration for every iteration and data set, or should I tune separately for each generated data set? I'm not sure whether tuning the hyperparameters once is sufficient, given that every data set comes from the same DGP and my main objective is to compare the classifiers. Is there anything else I should be aware of?
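A minimal sketch of the per-iteration variant I have in mind, using scikit-learn. The DGP (`make_classification`) and the two classifiers are just placeholders for whatever is actually simulated and compared:

```python
# Sketch: tune hyperparameters inside each simulation iteration.
# The DGP (make_classification) and the classifier/grid choices are
# placeholders, not a recommendation for any particular setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier

CLASSIFIERS = {
    "logreg": (LogisticRegression(max_iter=1000), {"C": [0.1, 1.0, 10.0]}),
    "tree": (DecisionTreeClassifier(random_state=0), {"max_depth": [2, 5, 10]}),
}

def run_simulation(n_iter=5, seed=0):
    rng = np.random.RandomState(seed)
    scores = {name: [] for name in CLASSIFIERS}
    for _ in range(n_iter):
        # Draw a fresh data set from the same DGP in every iteration
        X, y = make_classification(n_samples=400, n_features=10,
                                   random_state=rng.randint(2**31 - 1))
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=0.5, random_state=0)
        for name, (est, grid) in CLASSIFIERS.items():
            # Per-iteration tuning: the grid search sees only this data set
            search = GridSearchCV(est, grid, cv=3).fit(X_tr, y_tr)
            scores[name].append(search.score(X_te, y_te))
    # Average test accuracy per classifier across iterations
    return {name: float(np.mean(s)) for name, s in scores.items()}

print(run_simulation())
```

The alternative (tune once, then freeze) would move the `GridSearchCV` step out in front of the loop and pass the fixed best estimators into every iteration.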
