
Feature Selection (Intrinsic Methods)

Explore intrinsic (embedded) feature selection methods, which identify important features during model training. Understand how regularization techniques such as Ridge, Lasso, and Elastic-Net shrink model parameters, see how tree-based models contribute, and learn to use Scikit-Learn's `SelectFromModel` for practical feature selection.

Intrinsic or Embedded Methods

Embedded methods identify the features that contribute most to the model's performance while the model itself is being trained. You have seen other Feature Selection methods in previous lessons, and we will discuss several more, such as Decision Tree-based methods, in future lessons. Common embedded methods include:

  • Ridge Regression (L2-Regularization)

  • Lasso Regression (L1-Regularization)

  • Elastic-Net Regression (uses both L1 and L2 Regularization)

  • Decision Tree-Based Methods (Decision Tree Classification, Random Forest Classification, XGBoost Classification, LightGBM)
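To make the last bullet concrete, here is a minimal sketch of embedded selection with a tree-based model and `SelectFromModel`. The dataset (`load_breast_cancer`) and the `threshold="median"` setting are illustrative choices, not prescribed by this lesson; the key idea is that the Random Forest's `feature_importances_`, learned during training, drive the selection.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel

# Example dataset: 569 samples, 30 numeric features.
X, y = load_breast_cancer(return_X_y=True)

# SelectFromModel wraps any estimator exposing feature_importances_ (or coef_).
# threshold="median" keeps only features whose importance is at least the
# median importance, i.e. roughly the top half.
selector = SelectFromModel(
    estimator=RandomForestClassifier(n_estimators=100, random_state=0),
    threshold="median",
)
X_selected = selector.fit_transform(X, y)

print("Original feature count:", X.shape[1])
print("Selected feature count:", X_selected.shape[1])
```

Because the importances come from the fitted forest itself, the selection happens as a by-product of training, which is exactly what makes this an embedded method.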

L1 regularization (used by Lasso and Elastic-Net) can shrink some of the weights in the equation below exactly to zero, while Ridge's L2 penalty only shrinks them toward zero without eliminating them. Features whose weights are driven to zero drop out of the model entirely, which is why L1 regularization can serve as a Feature Selection method.

y = w_0 + x_1 w_1 + x_2 w_2 + x_3 w_3 + \dots + x_n w_n
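A small sketch shows this zeroing behavior. The synthetic regression data below (via `make_regression`) and the `alpha=1.0` penalty strength are illustrative assumptions: only 3 of the 10 features actually carry signal, and Lasso's L1 penalty drives the weights of the uninformative ones to exactly zero.

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

# Synthetic data: 10 features, but only 3 are informative.
X, y = make_regression(
    n_samples=200, n_features=10, n_informative=3,
    noise=5.0, random_state=42,
)

# L1 regularization: the penalty term alpha * sum(|w_i|) pushes
# small, uninformative weights exactly to zero.
lasso = Lasso(alpha=1.0).fit(X, y)

kept = [i for i, w in enumerate(lasso.coef_) if w != 0]
print("Non-zero weights at features:", kept)
print("Zeroed-out feature count:", sum(w == 0 for w in lasso.coef_))
```

The surviving non-zero coefficients mark the selected features; a Ridge fit on the same data would instead return ten small but non-zero weights.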