# Parameters

Learn about machine learning models' parameters, hyperparameters, and their importance.


## Parametric model

In machine learning, parametric models are functions defined by a fixed set of parameters. These functions are assumed to be able to approximate the underlying pattern of the data. By adjusting the values of the parameters during the training process, these models can learn to fit the data and make accurate predictions on new inputs.

Consider the function $f(x_1,x_2)=2x_1-3x_2+7$ with inputs $x_1$ and $x_2$. This function is an instance of a class of functions of the form $f_{w_1,w_2,w_0}(x_1,x_2)=w_1x_1+w_2x_2+w_0$, where $w_1=2,w_2=-3$, and $w_0=7$. Here, $w_1,w_2$, and $w_0$ are the parameters, also known as weights, and any choice of these parameters results in an instance from the function class. Notice that the function $f(x_1,x_2)=2x_1-3x_2+7$ is also an instance of the following different function class:

$f_{w_1,w_2,w_3,w_4,w_0}(x_1,x_2)=w_1x_1+w_2x_2+w_3x_1^2+w_4x_2^2+w_0$

where $w_1=2,w_2=-3,w_3=0,w_4=0$, and $w_0=7$.
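The point above can be checked directly in code. The sketch below (function names are illustrative, not from the source) evaluates the same function $f(x_1,x_2)=2x_1-3x_2+7$ as an instance of both function classes:

```python
def linear(x1, x2, w1, w2, w0):
    # Instance of the class f_{w1,w2,w0}(x1, x2) = w1*x1 + w2*x2 + w0
    return w1 * x1 + w2 * x2 + w0

def quadratic(x1, x2, w1, w2, w3, w4, w0):
    # Instance of the class with added squared terms:
    # f(x1, x2) = w1*x1 + w2*x2 + w3*x1^2 + w4*x2^2 + w0
    return w1 * x1 + w2 * x2 + w3 * x1**2 + w4 * x2**2 + w0

# Setting w3 = w4 = 0 makes the quadratic class collapse to the linear one,
# so both parameter choices represent f(x1, x2) = 2*x1 - 3*x2 + 7.
print(linear(1.0, 2.0, 2, -3, 7))           # 2*1 - 3*2 + 7 = 3.0
print(quadratic(1.0, 2.0, 2, -3, 0, 0, 7))  # same function, same value: 3.0
```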

### Examples

- $f_{m,c}(x)=mx+c$ is a function class parametrized by $m$ and $c$, and it represents all the lines in 2D.
- $f_{w_1,w_2,w_3,\dots,w_n}(x_1,x_2,x_3,\dots,x_n)=\sum_{i=1}^nw_ix_i$ is a class of functions that represents the weighted average of a set of objects $x_i$. In this case, all the parameters $w_i$ are non-negative and add up to $1$.
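The second example can be sketched as a short function (a minimal illustration; the helper name is our own) that checks the constraints on the parameters before computing the weighted average:

```python
def weighted_average(xs, ws):
    # The parameters w_i must be non-negative and sum to 1
    # for the function to be a valid weighted average.
    assert all(w >= 0 for w in ws), "weights must be non-negative"
    assert abs(sum(ws) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * x for w, x in zip(ws, xs))

# 10*0.5 + 20*0.3 + 30*0.2 = 5 + 6 + 6 = 17.0
print(weighted_average([10, 20, 30], [0.5, 0.3, 0.2]))
```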

Note: In general, if $\mathbf{x}$ represents the input and $\mathbf{w}$ represents the set of parameters, then $f_{\mathbf{w}}(\mathbf{x})$ is the representation of the parametric model.

Let's take a simple example of a line in 2D as a parametric model and implement it. Try to visualize different instances of this class by changing the parameters $m$ and $c$.
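The original interactive code widget is not reproduced here; the following is a minimal stand-in that evaluates the line $f_{m,c}(x)=mx+c$ at a few sample points (the variable names `m` and `c` match the parameters above, everything else is our own choice):

```python
def line(x, m, c):
    # f_{m,c}(x) = m*x + c : one instance of the class of all 2D lines
    return m * x + c

# Change m and c to produce different instances of the line class.
m, c = 2.0, 1.0
for x in range(-2, 3):
    print(f"x = {x:>2}   f(x) = {line(x, m, c):>5.1f}")
```

Plotting the `(x, f(x))` pairs (for example with a library such as matplotlib) would reproduce the visualization the lesson refers to: each choice of `m` and `c` draws a different line.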
