Parameter selection for SVM


I have a dataset which I use for classification with libSVM in Matlab. The dataset consists of 4 classes.

For SVM parameter selection I can do nested cross-validation. The problem is that I also need the values of the best parameters in the end.

After having done the nested cross-validation and obtained the final accuracy, I want the values of the best parameters. Then I will train an SVM for each class (one-vs-all) with the best parameters and select the most important features (according to highest weight), i.e. build a feature importance map.

How can I do this? Should I skip nested cross-validation and instead just loop over all parameters with a single cross-validation?
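
For illustration, here is a minimal sketch of the usual resolution with the libSVM Matlab interface (svmtrain/svmpredict): use nested cross-validation only to estimate the final accuracy, then run one plain grid search on all the training data to obtain the parameters you report and reuse. The names X (n-by-d feature matrix), y (n-by-1 label vector) and the gridSearch helper (sketched under the answer below) are assumptions, not part of the original setup:

    % Nested CV: outer folds give an unbiased accuracy estimate; the inner
    % grid search (inside gridSearch, see the answer below) picks the
    % parameters separately for each outer training set.
    k = 5;
    n = numel(y);
    idx = mod(randperm(n)', k) + 1;              % random k-fold assignment
    outerAcc = zeros(k, 1);
    for i = 1:k
        tr = (idx ~= i); te = (idx == i);
        [c, g] = gridSearch(y(tr), X(tr,:), 5);  % inner CV, training folds only
        model = svmtrain(y(tr), X(tr,:), sprintf('-t 2 -c %g -g %g -q', c, g));
        [~, acc, ~] = svmpredict(y(te), X(te,:), model);
        outerAcc(i) = acc(1);                    % accuracy on the held-out fold
    end
    fprintf('Nested-CV accuracy estimate: %.2f%%\n', mean(outerAcc));

    % Parameters to report and reuse: one plain grid search on ALL the data.
    [bestC, bestG] = gridSearch(y, X, 5);

The inner grid search only ever sees the outer training folds, which is what keeps the outer accuracy estimate unbiased; the parameters you report afterwards come from the separate grid search on the full data.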

Second, if I use a linear SVM, then using the weight vector w to assign importance to features works, but does it also work for a non-linear SVM (e.g. an RBF kernel)?


1 Answer

rzo1 (accepted answer)

To find the "best" parameters for your kernel of choice, you have to loop over all parameter combinations and perform a so-called "grid search". LIBSVM does not provide a built-in grid-search mechanism.
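
A minimal grid-search sketch for the Matlab interface, assuming an RBF kernel and the coarse (C, gamma) grid from the libSVM authors' practical guide; when '-v k' is passed, svmtrain returns the k-fold cross-validation accuracy as a scalar instead of a model:

    function [bestC, bestG, bestAcc] = gridSearch(y, X, k)
    % Exhaustive search over (C, gamma) for an RBF kernel ('-t 2'),
    % scored by k-fold cross-validation accuracy. Save as gridSearch.m.
    bestAcc = -Inf; bestC = NaN; bestG = NaN;
    for log2c = -5:2:15
        for log2g = -15:2:3
            opts = sprintf('-t 2 -c %g -g %g -v %d -q', ...
                           2^log2c, 2^log2g, k);
            acc = svmtrain(y, X, opts);  % scalar CV accuracy, no model returned
            if acc > bestAcc
                bestAcc = acc;
                bestC   = 2^log2c;
                bestG   = 2^log2g;
            end
        end
    end
    end

In practice you would follow this coarse pass with a finer grid around the best point.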

Regarding your second question, I would suggest performing feature selection (e.g. information gain, mutual information, ...) as a pre-processing step before the actual work with the SVM, and in a second step taking the weight vector w into consideration (but I am not sure whether this will work with RBF or Gaussian kernels...).
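
On the weight-vector part: for a linear kernel the primal w can be recovered from a trained libSVM model, because w = sum_i alpha_i y_i x_i and the Matlab interface stores alpha_i y_i in model.sv_coef; with an RBF kernel the feature map is implicit, so there is no explicit w over the input features to rank. A minimal sketch for one one-vs-all binary model (yk, the +1/-1 relabeled label vector for the current class, and bestC are assumed names):

    % Train one one-vs-all model with a linear kernel ('-t 0').
    model = svmtrain(yk, X, sprintf('-t 0 -c %g -q', bestC));

    % Primal weights: w = SVs' * sv_coef (sv_coef already holds alpha_i*y_i);
    % the bias is b = -rho.
    w = model.SVs' * model.sv_coef;
    b = -model.rho;

    % Rank features by |w| for the importance map.
    [~, featureRank] = sort(abs(full(w)), 'descend');

Repeating this for each of the 4 classes gives one feature ranking per one-vs-all model.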