Usually, when fitting a LASSO, the best hyperparameter lambda is taken to be lambda.1se (in the glmnet package): the largest lambda whose cross-validated metric (usually AUC, Accuracy or Deviance) is within one standard error of the optimum reached at lambda.min, i.e. roughly CV(lambda.1se) <= CV(lambda.min) + SE(lambda.min) for an error-type metric.
This way the penalization is stronger and the model is less prone to overfitting.
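For reference, here is a minimal glmnet sketch of the two choices (the simulated data and AUC metric are just placeholders for my actual setup):

```r
library(glmnet)

## simulated binary-classification data, purely for illustration
set.seed(1)
x <- matrix(rnorm(200 * 50), nrow = 200)
y <- factor(rbinom(200, 1, plogis(x[, 1] - x[, 2])))

## 10-fold CV over the lambda path, using AUC as the CV metric
cvfit <- cv.glmnet(x, y, family = "binomial", type.measure = "auc", alpha = 1)

cvfit$lambda.min  # lambda at the best CV AUC
cvfit$lambda.1se  # largest lambda within one SE of that optimum
```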
However, in caret I always find the assumption that the best model is the one corresponding to the min/max of the CV metric. Is that wise?
Is there a way to ask caret's train function to select the model corresponding to lambda.1se? See the sketch of my current setup below.
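For concreteness, this is roughly what I am doing now. I noticed that trainControl has a selectionFunction argument with a "oneSE" option (a one-standard-error rule), but I am not sure whether that matches glmnet's lambda.1se, since caret computes the SE over its own resamples and tuning grid; the grid values below are arbitrary placeholders:

```r
library(caret)
library(glmnet)

## same simulated data as above, with caret-friendly class labels
set.seed(1)
x <- matrix(rnorm(200 * 50), nrow = 200)
y <- factor(ifelse(rbinom(200, 1, plogis(x[, 1] - x[, 2])) == 1, "yes", "no"))

ctrl <- trainControl(method = "cv", number = 10,
                     classProbs = TRUE,
                     summaryFunction = twoClassSummary,
                     selectionFunction = "oneSE")  # instead of the default "best"

fit <- train(x = x, y = y,
             method = "glmnet",
             metric = "ROC",
             tuneGrid = expand.grid(alpha = 1,
                                    lambda = 10^seq(-4, 0, length.out = 50)),
             trControl = ctrl)

fit$bestTune  # lambda chosen by the one-SE rule over caret's own grid
```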
Thank you.