Decision tree and random forest models let us retrieve the importance of individual features after training through the feature_importances_ attribute of a fitted model instance. The higher the value, the more important the corresponding feature is. For bagged decision trees such as a random forest, the importances are obtained by averaging the importances computed by the individual trees.
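The averaging described above can be verified directly: the forest's reported importances match the mean of the per-tree importances. A minimal sketch, assuming scikit-learn and the iris dataset (the dataset and hyperparameters are illustrative choices, not from the text):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Importances aggregated across the ensemble
print(forest.feature_importances_)

# The same values recovered manually as the mean over the bagged trees
manual = np.mean(
    [tree.feature_importances_ for tree in forest.estimators_], axis=0
)
print(np.allclose(manual, forest.feature_importances_))  # True
```

Each tree's importances are normalized to sum to 1, so their mean is already normalized as well.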

Feature importance: Single decision tree

In a DecisionTreeClassifier, the importance of a feature is computed as the (normalized) total reduction of the split criterion brought by that feature. With the default Gini criterion, this quantity is also known as the Gini importance.
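For a single tree, the attribute can be read directly after fitting. A minimal sketch, again assuming scikit-learn and the iris dataset (illustrative choices, not specified by the text):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

data = load_iris()
tree = DecisionTreeClassifier(random_state=0).fit(data.data, data.target)

# One Gini importance per input feature, normalized to sum to 1
for name, importance in zip(data.feature_names, tree.feature_importances_):
    print(f"{name}: {importance:.3f}")

print(tree.feature_importances_.sum())  # approximately 1.0
```

Because the values are normalized, they are best read as relative shares of the total criterion reduction rather than absolute scores.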
