1- Decision Trees

This lesson will go over decision trees in more detail and provide the steps involved in their implementation.

Introduction to decision trees

Decision trees create a decision structure for interpreting patterns by repeatedly splitting the data into groups, choosing at each step the variable that best separates the data into homogeneous or numerically similar groups. The quality of a split is commonly measured with entropy, a measure of how mixed the different classes are within a group. The primary appeal of decision trees is that they can be displayed graphically as a tree-like graph, and they’re easy to explain to non-experts.
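As a minimal sketch of the entropy measure mentioned above (the helper function and example labels are illustrative, not part of the lesson):

```python
import numpy as np

def entropy(labels):
    """Entropy of a set of class labels: -sum(p * log2(p)) over the classes."""
    _, counts = np.unique(labels, return_counts=True)
    probs = counts / counts.sum()
    return -np.sum(probs * np.log2(probs))

# A pure group has entropy 0; an evenly mixed two-class group has entropy 1.
print(entropy(["beach", "beach", "beach", "beach"]))      # 0.0
print(entropy(["beach", "stay in", "beach", "stay in"]))  # 1.0
```

A split that produces groups with lower entropy than the original data is considered a better split.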

Unlike an actual tree, a decision tree is drawn upside down, with the “leaves” at the bottom, or foot, of the tree. Each branch represents the outcome of a decision on a variable, and each leaf node represents a class label, such as “Go to the beach” or “Stay in.” A decision rule is therefore the path from the tree’s root down to a terminal leaf node.
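The sketch below shows how such root-to-leaf rules can be printed, assuming scikit-learn and a small made-up weather dataset (the feature names and values are hypothetical, used only for illustration):

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical weather data: [temperature (°C), humidity (%)]
X = [[30, 40], [28, 45], [22, 90], [18, 85], [25, 50], [15, 95]]
y = ["Go to the beach", "Go to the beach", "Stay in",
     "Stay in", "Go to the beach", "Stay in"]

# Use entropy as the splitting criterion and keep the tree shallow for readability.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=2).fit(X, y)

# Each printed rule corresponds to the path from the root to a leaf node.
print(export_text(tree, feature_names=["temperature", "humidity"]))
```

Running this prints an indented set of if/else conditions, each path ending in a class label, which mirrors the upside-down tree described above.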
