
Feature Scaling

Explore feature scaling methods, including normalization and standardization, to understand how they affect the performance of machine learning algorithms. This lesson helps you grasp why scaling is essential for gradient descent and distance-based models.


Feature scaling

Feature scaling is a part of feature engineering. It refers to the process of normalizing the features (the columns, or dimensions) of a dataset. Many machine learning algorithms are sensitive to the scale, or magnitude, of the features.

Benefits

Feature scaling has the following benefits:

  • It helps gradient descent-based algorithms converge faster (see the sketch after this list).

  • It helps distance-based algorithms give equal weight to each feature when computing similarity.

  • It makes feature importances comparable across features.
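To make the first benefit concrete, here is a minimal sketch (the data is synthetic and made up; the learning rates and tolerance are arbitrary choices) comparing how many batch gradient descent steps a simple linear model needs on raw versus min-max-scaled features:

```python
# A minimal sketch of benefit 1: batch gradient descent on
# min-max-scaled features converges far faster than on raw features.
import numpy as np

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(18, 70, n)              # small-scale feature
income = rng.uniform(20_000, 120_000, n)  # large-scale feature
y = 0.5 * age + 0.0001 * income + rng.normal(0.0, 1.0, n)

def steps_to_converge(X, y, lr, tol=1e-6, max_steps=100_000):
    """Run batch gradient descent on the MSE loss; return the number of
    steps until the gradient norm drops below tol (or max_steps)."""
    w = np.zeros(X.shape[1])
    for step in range(1, max_steps + 1):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        if np.linalg.norm(grad) < tol:
            return step
        w -= lr * grad
    return max_steps

X_raw = np.column_stack([age, income])
X_scaled = (X_raw - X_raw.min(axis=0)) / (X_raw.max(axis=0) - X_raw.min(axis=0))

# On raw features the learning rate must be tiny (larger values diverge
# because the income feature dominates the curvature), so the run
# typically exhausts its step budget while still far from the optimum.
print(steps_to_converge(X_raw, y, lr=1e-10))   # hits max_steps
# On scaled features a much larger rate is stable and the run converges
# in on the order of a thousand steps. (The gradient-norm tolerance is
# not scale-invariant, so this is a qualitative comparison only.)
print(steps_to_converge(X_scaled, y, lr=0.1))
```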

Distance-based algorithms, such as k-nearest neighbors and k-means clustering, take the distance or similarity between instances of the dataset into account when doing their computations.
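As a small illustration of the second benefit, here is a sketch (the feature names, values, and ranges are made up) showing how an unscaled large-magnitude feature dominates a Euclidean distance until the features are rescaled:

```python
# An unscaled large-magnitude feature dominates the Euclidean distance
# until both features are rescaled to [0, 1].
import numpy as np

# Two hypothetical instances: [age in years, annual income in dollars]
a = np.array([25.0, 50_000.0])
b = np.array([45.0, 52_000.0])

# The raw distance is driven almost entirely by income.
print(np.linalg.norm(a - b))  # ~2000.1; the 20-year age gap barely registers

# Min-max scale each feature using assumed feature ranges.
mins = np.array([18.0, 20_000.0])
maxs = np.array([70.0, 120_000.0])
a_scaled = (a - mins) / (maxs - mins)
b_scaled = (b - mins) / (maxs - mins)

# Now both features contribute on a comparable [0, 1] scale.
print(np.linalg.norm(a_scaled - b_scaled))  # ~0.385; income no longer swamps age
```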

Types of feature scaling

There are two widely used types of feature scaling:

  1. Normalization: This involves rescaling the values of a feature into the range between 0 and 1 (it is also known as min-max scaling). Normalization is a good technique to use when the distribution of the input columns is unknown or is not Gaussian. The formula for it is:

$$X' = \frac{X - X_{min}}{X_{max} - X_{min}}$$

...
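As a concrete sketch of the formula above, min-max normalization is available in scikit-learn as MinMaxScaler (the data values below are made up for illustration):

```python
# Min-max normalization with scikit-learn's MinMaxScaler.
import numpy as np
from sklearn.preprocessing import MinMaxScaler

X = np.array([[25.0, 50_000.0],
              [45.0, 52_000.0],
              [33.0, 90_000.0]])

scaler = MinMaxScaler()             # default feature_range is (0, 1)
X_scaled = scaler.fit_transform(X)  # computes (X - X_min) / (X_max - X_min) per column
print(X_scaled)                     # every column now lies in [0, 1]
```

Note that the same fitted scaler should be reused to transform test data, so that training and test features share the same mapping.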