Recursive Feature Elimination with Cross-Validation (RFECV) does not work with the Multi-Layer Perceptron estimator (along with several other classifiers), since the estimator does not expose `coef_` or `feature_importances_`.
I wish to use a feature-selection method that works across many classifiers and performs cross-validation to verify the selected features. Any suggestions?
There is a feature-selection method for structured data that is independent of the model choice: Permutation Importance. It is well explained here and elsewhere, and it is worth a look. It has since been implemented in scikit-learn as `sklearn.inspection.permutation_importance` (available from version 0.22).
Permutation importance is model-agnostic, so it applies to an MLP just as well as to any other estimator; a version like the one in the article is also easy to write by hand.
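Since the snippet from the article is not reproduced above, here is a minimal sketch using scikit-learn's own `permutation_importance` with an `MLPClassifier`; the dataset, network size, and `n_repeats` value are illustrative choices, not from the original article:

```python
from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Toy dataset: 8 features, only 3 of them informative (illustrative choice).
X, y = make_classification(
    n_samples=300, n_features=8, n_informative=3, random_state=0
)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
model.fit(X_train, y_train)

# Shuffle each feature column in turn and measure the drop in score;
# larger drops indicate more important features.
result = permutation_importance(
    model, X_train, y_train, n_repeats=10, random_state=0
)
for i in result.importances_mean.argsort()[::-1]:
    print(
        f"feature {i}: "
        f"{result.importances_mean[i]:.3f} +/- {result.importances_std[i]:.3f}"
    )
```

Features whose shuffling barely changes the score can then be dropped, and the whole procedure can be wrapped in a cross-validation loop if you want the selection verified across folds.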
Note that the training set is used for computing the feature importances here, but you could use the held-out test set instead, as discussed here.