# Model Compilation

Let's study another step in building a neural network: Keras' model compilation.

## The `compile` method

After the model architecture is set, the model can be compiled using the `compile` method. This sets up the network for optimization: Keras creates an internal function that performs backpropagation efficiently.
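A minimal sketch of this step is shown below; the layer sizes and input shape are illustrative choices, not part of the lesson:

```python
from tensorflow import keras
from tensorflow.keras import layers

# A small illustrative model; any architecture works the same way.
model = keras.Sequential([
    keras.Input(shape=(10,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(1),
])

# compile() configures the optimizer and loss that backpropagation will use.
model.compile(optimizer="adam", loss="mean_squared_error")
```

Here the optimizer and loss are passed as string shortcuts; they can also be passed as objects, as shown later in this lesson.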

## Arguments for the `compile` method

The `compile` method takes two main arguments:

### 1. Optimizer

The optimizer specifies how the network's weights are updated during training, including the learning rate. The learning rate is important because it controls how quickly the model converges to a good set of weights.

**Note:** There are a few optimizers that automatically tune the learning rate. We do not need to know the details of each optimizer; we only need to choose one that works well across many problems.

`Adam` (**adaptive moment estimation**) is a good choice. It adjusts the learning rate during gradient descent and ensures reasonable values for the weights during the weight optimization process.
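Instead of the string shortcut `"adam"`, the optimizer can be instantiated explicitly to set the initial learning rate; the value `0.001` below is Keras' default, used here for illustration:

```python
from tensorflow import keras

# Adam with an explicit initial learning rate (0.001 is the Keras default).
# Adam will adapt the effective step size per weight during training.
optimizer = keras.optimizers.Adam(learning_rate=0.001)
```

The `optimizer` object can then be passed to `compile(optimizer=optimizer, ...)`.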

### 2. Loss function

The loss function measures how far the model's predictions are from the targets; it is the quantity the optimizer minimizes during training.

#### Regression

Mean squared error is the most common choice for *regression problems*.
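Mean squared error averages the squared differences between predictions and targets. A quick check with hand-picked illustrative values:

```python
import numpy as np
from tensorflow import keras

# Illustrative targets and predictions.
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 2.0, 2.0])

# MSE = ((1.5-1)^2 + (2-2)^2 + (2-3)^2) / 3 = 1.25 / 3
mse = keras.losses.MeanSquaredError()
loss = float(mse(y_true, y_pred))
```

This is the same quantity Keras computes when `loss="mean_squared_error"` is passed to `compile`.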
