Explainable AI (XAI): Permutation Importance


I was using Permutation Importance to check the explainability (XAI) of my ML model. Permutation Importance is supposed to show a weight for every feature in the dataset. The problem is that the output does not show all the feature names, and I need the output to include all the features.

Can anyone kindly help me show the output with the names of all the features? I have used the following code:

import eli5
from eli5.sklearn import PermutationImportance

# Fit permutation importance for the trained classifier on the test set
perm = PermutationImportance(rfc, random_state=1).fit(X_test, Y_test)

# Display the feature weights using the original column names
eli5.show_weights(perm, feature_names=X_train.columns.tolist())

I tried searching for a way to specify the number of features to display, but didn't find any answer.
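One thing worth checking (a sketch based on eli5's documented behavior, not a confirmed fix for your data): eli5.show_weights truncates the table to the top 20 features by default. Its top parameter controls how many rows are rendered, so passing a value at least as large as the number of columns should list every feature. Here, rfc, X_train, X_test, and Y_test are assumed to be the objects from the question:

import eli5
from eli5.sklearn import PermutationImportance

# Same fit as in the question
perm = PermutationImportance(rfc, random_state=1).fit(X_test, Y_test)

feature_names = X_train.columns.tolist()

# show_weights keeps only the 20 highest weights by default;
# raising `top` to the full feature count should display every row
eli5.show_weights(perm, feature_names=feature_names, top=len(feature_names))

Passing top=None may also disable the limit, but giving the explicit feature count is the safer choice. Note also that show_weights renders an HTML object (e.g., in a Jupyter notebook); in a plain script its return value will not print as a readable table.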
