I am wondering whether there is some correlation between the hyperparameters of two different classifiers.

For example: say we run `LogisticRegression` on a dataset and find its best hyperparameters through `GridSearch`, and we then want to run another classifier such as `SVC` (the SVM classifier) on the same dataset. Instead of searching over all of its hyperparameters with `GridSearch`, can we fix some hyperparameter values (or reduce their ranges, to limit the search space for `GridSearch`)?
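To make the idea concrete, here is a rough sketch of what I mean by "reducing the range" (the dataset and the choice of a one-decade window around the `LogisticRegression` optimum are just my assumptions for illustration):

```python
# Sketch: tune LogisticRegression fully, then search SVC only in a narrow
# window around the C that LogisticRegression preferred. Whether such a
# window is safe to assume is exactly my question.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Full search for LogisticRegression
lr_grid = GridSearchCV(LogisticRegression(max_iter=5000),
                       {"C": [0.001, 0.01, 0.1, 1, 10, 100]}, cv=5)
lr_grid.fit(X, y)
best_c = lr_grid.best_params_["C"]

# Reduced search for SVC, centred on LogisticRegression's best C (assumption)
svc_grid = GridSearchCV(SVC(kernel="rbf"),
                        {"C": [best_c / 10, best_c, best_c * 10]}, cv=5)
svc_grid.fit(X, y)
print(lr_grid.best_params_, svc_grid.best_params_)
```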
As an experiment, I used scikit-learn classifiers such as `LogisticRegression`, `SVC`, `LinearSVC`, `SGDClassifier`, and `Perceptron` to classify some well-known datasets. In some cases I can see some correlation empirically, but not always and not for all datasets.
Could someone please help me clarify this point?
I don't think you can correlate the hyperparameters of different classifiers like this. The main reason is that each classifier behaves differently: it fits the data in its own way, according to its own set of equations. Take, for example, `SVC` with two different kernels, `rbf` and `sigmoid`. It might be that `rbf` fits the data perfectly with the regularization parameter `C` set to, say, 0.001, while the `sigmoid` kernel over the same data fits best with a `C` value of 0.00001. The two values might also happen to be equal, but you can never say that for sure.

When you say that you are able to see some correlation empirically, it may simply be a coincidence. It all depends on the dataset and on the classifiers; you cannot apply it globally. **Correlation does not always equal causation.**
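As a quick illustration of the kernel point, here is a sketch (the dataset and the `C` grid are arbitrary choices on my part) that runs the same search over `C` for both kernels; there is no guarantee the winning values agree:

```python
# Same grid over C, two kernels, same data: the best C values need not match.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
c_grid = {"C": [1e-5, 1e-3, 1e-1, 1, 10, 100]}

for kernel in ("rbf", "sigmoid"):
    search = GridSearchCV(SVC(kernel=kernel), c_grid, cv=5).fit(X, y)
    print(kernel, search.best_params_)  # compare the two winners
```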
You can visit this site and see for yourself that although different regressor functions share the same parameter `a`, their equations are vastly different, and hence over the same dataset you might get drastically different values of `a`.
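For a toy version of the same point, the sketch below (my own made-up example, using `scipy.optimize.curve_fit`) fits two different functions that both use a parameter named `a` to the same points; the fitted values of `a` need not agree at all:

```python
# Fit y = a*x and y = x**a to the same data: same parameter name, different
# equations, so the fitted "a" values come out very different.
import numpy as np
from scipy.optimize import curve_fit

np.random.seed(0)
x = np.linspace(1, 5, 20)
y = 3.0 * x + np.random.normal(scale=0.1, size=x.size)

def linear(x, a):
    return a * x       # y = a * x

def power(x, a):
    return x ** a      # y = x ** a

a_linear, _ = curve_fit(linear, x, y)
a_power, _ = curve_fit(power, x, y)
print(a_linear, a_power)  # same name "a", very different fitted values
```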