# Model Evaluation Measures (Explained Variance Score, MAE, MSE)

In this lesson we will look at different evaluation measures for Regression Models.

## Regression Models Evaluation Metrics

Once we have built a model on the training dataset, it is time to evaluate it on the test dataset to check how well it performs. Evaluation also helps us determine:

• If the model is overfitting
• If the model is underfitting
• If we need to revise our Feature Engineering or Feature Selection techniques

We use the following measures to assess the performance of a Regression Model.

### Explained Variance Score

Explained Variance is one of the key measures for evaluating Regression Models. In statistics, explained variation measures the proportion to which a regression model accounts for the variation (dispersion) of a given dataset.

#### Formula

If $\hat{y}$ is the predicted target real-valued output, $y$ is the corresponding (correct) target real-valued output, and $Var$ is the variance, then the explained variance is estimated as follows:

$explained\_variance(\hat{y}, y) = 1 - \frac{Var(y-\hat{y})}{Var(y)}$

The best possible score is 1.0; lower values are worse.
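As a quick sanity check of the formula, the score can be computed directly with NumPy. The sample values here are hypothetical, chosen only to illustrate the calculation:

```python
import numpy as np

# Hypothetical actual and predicted target values (illustrative only)
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# explained_variance(y_hat, y) = 1 - Var(y - y_hat) / Var(y)
score = 1 - np.var(y_true - y_pred) / np.var(y_true)
print(round(score, 4))  # → 0.9572
```

A score close to 1.0 indicates that the residuals $y - \hat{y}$ have little variance compared to the targets themselves, i.e. the model accounts for most of the dispersion in the data.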

#### Code

These code examples are taken from the scikit-learn documentation. In all the code below:

• y_true holds the actual (hypothetical) values.
• y_pred holds the predicted (hypothetical) values.
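The original snippet is not reproduced here; a minimal example in the style of the scikit-learn documentation, using hypothetical values, would look like this:

```python
from sklearn.metrics import explained_variance_score

y_true = [3, -0.5, 2, 7]   # actual hypothetical values
y_pred = [2.5, 0.0, 2, 8]  # predicted hypothetical values

# Best possible score is 1.0; lower values are worse
print(explained_variance_score(y_true, y_pred))  # ≈ 0.9572
```

This computes the same quantity as the formula above, $1 - Var(y - \hat{y}) / Var(y)$, without having to implement it by hand.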
