Lazy learning algorithms

Lazy learning is a category of machine learning in which algorithms do not process the training data until a prediction needs to be made. Rather than building a general model up front, they memorize the training data. Upon receiving a new query, they derive a prediction from similar instances in the training dataset.

Lazy learning algorithms perform well when the data distribution is complex or noisy. A classic example is the k-nearest neighbors (K-NN) algorithm, which predicts the class label of a query point from the class labels of the k training points closest to it.

Lazy learning algorithms

Let’s take a look at the following lazy learning algorithm.

K nearest neighbors (K-NN)

K-nearest neighbors (K-NN): K-NN is a type of instance-based learning that falls under the supervised learning domain, in which we assign a label to a new data point based on the labels of its k nearest neighbors. In the diagram below, the new data point is assigned to Category 1 because it lies closer to Category 1 than to Category 2. We can measure this proximity with a distance metric such as Euclidean or Manhattan distance.

K Nearest Neighbors
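The voting scheme described above can be sketched in a few lines of plain Python. This is a minimal illustration, not a production implementation; the function and variable names are our own:

```python
from collections import Counter
from math import dist  # Euclidean distance (Python 3.8+)

def knn_predict(train_points, train_labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Sort the memorized training points by Euclidean distance to the query.
    neighbors = sorted(zip(train_points, train_labels),
                       key=lambda pair: dist(pair[0], query))
    # Collect the labels of the k closest neighbors and take the majority vote.
    votes = Counter(label for _, label in neighbors[:k])
    return votes.most_common(1)[0][0]

# Two small clusters standing in for Category 1 and Category 2.
points = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1),   # Category 1
          (5.0, 5.0), (5.2, 4.8), (4.9, 5.1)]   # Category 2
labels = ["Category 1"] * 3 + ["Category 2"] * 3

print(knn_predict(points, labels, (1.1, 1.0), k=3))  # → Category 1
```

Note that no training step appears anywhere: the "model" is just the stored points, and all computation happens at prediction time.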

Difference between lazy learning and eager learning

Lazy learning memorizes the training data and uses it when predicting labels for new data. Eager learning, in contrast, uses the training data to build a classification model before any test data arrives. Thus, eager learning algorithms have a high training time and a low consultation time, while the opposite is true for lazy learning algorithms.

Moreover, in eager learning, the system commits to a single hypothesis that covers the entire instance space, whereas lazy learning effectively works with a larger hypothesis space by forming a local approximation for each query.

Importance of lazy learning

Lazy learning algorithms are especially useful when the dataset is updated continuously with new data instances. Each update would force an eager learner to retrain its model on the new data, whereas a lazy learner simply stores the new instances; it therefore adapts quickly to new data. Lazy learning works well with large, ever-changing datasets that have relatively few attributes and are queried frequently. Moreover, lazy learners are less affected by outliers than eager learning algorithms and easily handle complex data distributions and non-linear relationships.
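This adaptability can be made concrete with a small sketch (class and method names are hypothetical): because a lazy learner's "training" is just storing data, absorbing a new instance is a constant-time append with no retraining step.

```python
from collections import Counter
from math import dist

class LazyKNN:
    """Minimal lazy learner: training just memorizes the data."""
    def __init__(self, k=3):
        self.k, self.X, self.y = k, [], []

    def fit(self, X, y):
        # No model is built; fitting only stores the data.
        self.X, self.y = list(X), list(y)
        return self

    def add(self, x, label):
        # Adapting to new data is a simple append; no retraining needed.
        self.X.append(x)
        self.y.append(label)

    def predict(self, query):
        # All computation is deferred to query (consultation) time.
        nearest = sorted(zip(self.X, self.y),
                         key=lambda pair: dist(pair[0], query))
        return Counter(lbl for _, lbl in nearest[:self.k]).most_common(1)[0][0]

model = LazyKNN(k=1).fit([(0.0, 0.0)], ["old"])
model.add((10.0, 10.0), "new")     # dataset updated instantly
print(model.predict((9.0, 9.0)))   # → new
```

An eager learner would have to rebuild its model after the `add` call; here the new instance is available to the very next query.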

Test yourself

Before moving on to the conclusion, test your understanding.

1) What is the main characteristic of lazy learning algorithms?

A) They build a general model before making predictions.
B) They utilize training data only when a prediction needs to be made.
C) They have a high training time and low consultation time.


In conclusion, lazy learning algorithms, such as K-nearest neighbors (K-NN), offer a flexible and adaptive approach to machine learning, particularly suited for complex, noisy, and continuously evolving datasets.

Copyright ©2024 Educative, Inc. All rights reserved