Sklearn MLP Feature Selection


Recursive Feature Elimination with Cross Validation (RFECV) does not work with the Multi Layer Perceptron estimator (nor with several other classifiers), because RFECV requires the estimator to expose coef_ or feature_importances_, which MLPClassifier does not.

I am looking for a feature selection method that works across many classifiers and uses cross validation to verify its selection. Any suggestions?

1 Answer

H4dr1en

There is a feature selection technique for structured data that is independent of the model choice, called Permutation Importance. It is well explained here and elsewhere, and it is worth a look. It has since been implemented in scikit-learn as sklearn.inspection.permutation_importance (available from version 0.22 onwards).
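
Assuming a recent scikit-learn (0.22+), a minimal sketch of that built-in with an MLP might look like this; the synthetic dataset and the MLP hyperparameters are just placeholders:

from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlp = MLPClassifier(max_iter=1000, random_state=0).fit(X_train, y_train)

# n_repeats shuffles each feature several times to average out noise;
# scoring on the held-out test set gives a generalization-based importance
result = permutation_importance(mlp, X_test, y_test, n_repeats=10, random_state=0)
print(result.importances_mean)  # one score per feature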

You can also roll it by hand in a few lines, for example like this (adapted from the article, with the missing numpy import and a few comments added):

import numpy as np

def permutation_importances(rf, X_train, y_train, metric):
    # `rf` can be any fitted estimator (e.g. an MLPClassifier), despite the name
    baseline = metric(rf, X_train, y_train)
    imp = []
    for col in X_train.columns:
        save = X_train[col].copy()
        # Shuffle a single column in place, re-score, then restore the column
        X_train[col] = np.random.permutation(X_train[col])
        m = metric(rf, X_train, y_train)
        X_train[col] = save
        # Importance = drop in score caused by destroying this feature
        imp.append(baseline - m)
    return np.array(imp)
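
For instance, the function above could be applied to an MLP like this (a hypothetical usage sketch: accuracy_metric and the synthetic dataset are assumptions for illustration, not part of the original answer):

import pandas as pd
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

def accuracy_metric(model, X, y):
    # Assumed helper: any callable with this (model, X, y) signature works
    return model.score(X, y)

X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_train = pd.DataFrame(X, columns=[f"f{i}" for i in range(5)])

mlp = MLPClassifier(max_iter=1000, random_state=0).fit(X_train, y)
print(permutation_importances(mlp, X_train, y, accuracy_metric))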

Note that the training set is used here to compute the feature importances; you could instead use the test set, as discussed here.