Related Tags

decision trees
regression trees
regression

# What are regression trees?

Educative Answers Team

Regression trees are decision trees whose target variable is continuous rather than categorical.

For example, imagine a newly launched product whose price depends on many factors; predicting that price is the kind of problem where regression trees can be used. A regression tree is created through a process called binary recursive partitioning: an iterative process that splits the data into partitions (branches) and then continues splitting each partition into smaller groups as the method moves down each branch.
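The splitting process above can be sketched with scikit-learn, whose `DecisionTreeRegressor` implements binary recursive partitioning (CART). The one-feature dataset below is made-up illustrative data, standing in for the product-pricing example:

```python
# A minimal sketch, assuming scikit-learn is available.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Hypothetical data: one feature (e.g. a product attribute) and a continuous price.
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([10.0, 12.0, 20.0, 22.0, 30.0, 31.0])

# max_depth=2 limits the recursive partitioning to two levels of splits.
tree = DecisionTreeRegressor(max_depth=2, random_state=0)
tree.fit(X, y)

# Each leaf predicts the mean target value of the training samples it contains.
print(tree.predict([[2.5]]))
```

Because each split is chosen to minimize the squared error within the resulting partitions, the prediction for a new point is the average target of the training points that fall in the same leaf.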

What a regression tree looks like.

## Structure of a regression tree

Root: The beginning of the decision tree. This first node represents the first condition applied to the data.

Leaf: The terminal node of the tree. It holds a predicted value and does not point to any further condition.

Decision node: A node after the root where a condition further divides the data into different categories.

Child node: A node that is further divided into different categories is called a parent node; the nodes that result from this division are its child nodes.

## Features of regression trees
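The root, decision nodes, and leaves described above can be made visible with scikit-learn's `export_text`, which prints a fitted tree as indented rules. The tiny dataset here is hypothetical:

```python
# A minimal sketch, assuming scikit-learn: print a fitted tree's structure.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

X = np.array([[1], [2], [3], [4]])
y = np.array([5.0, 6.0, 14.0, 15.0])

tree = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)

# The first condition printed is the root; nested conditions are decision
# nodes; "value: ..." lines are the leaves (terminal nodes).
print(export_text(tree, feature_names=["feature"]))
```

Reading the printout top to bottom mirrors how a sample travels from the root, through decision nodes, to the leaf whose value becomes the prediction.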

1. Visualization of data becomes easier, as users can identify and trace every step.

2. A specific decision node can be given priority over other decision nodes.

3. As the regression tree progresses, undesired data is filtered out at each step. As a result, only the important data is left to process, which increases the efficiency and accuracy of the design.

4. Regression trees are easy to prepare, and they can be used to present data during meetings, presentations, etc.

Let's look at one basic example of a regression tree that models the salary of a company's employees based on their position.

Basic example of a regression tree.
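A hedged sketch of that salary example, assuming scikit-learn and using made-up position levels and salaries (the original article's figures are not reproduced here):

```python
# Hypothetical data: numeric position level (1 = junior ... 5 = director)
# and the corresponding salary.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

position_level = np.array([[1], [2], [3], [4], [5]])
salary = np.array([40_000.0, 50_000.0, 65_000.0, 90_000.0, 150_000.0])

model = DecisionTreeRegressor(random_state=0).fit(position_level, salary)

# A fully grown tree puts each distinct level in its own leaf, so a
# training-set level is predicted exactly (the leaf mean).
print(model.predict([[4]]))
```

Note that an unrestricted tree memorizes the training data; in practice, limiting depth or leaf size (e.g. `max_depth`, `min_samples_leaf`) is used to avoid overfitting.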
