# OLS Model Coefficients

Let's learn how to estimate and interpret ordinary least squares (OLS) model coefficients.


What technique should we use to estimate the sample regression model? As noted earlier, we use OLS to find the set of coefficient estimates that minimizes the residual sum of squares. The OLS estimator for the vector of coefficients in the sample model, denoted $B$ and containing $a$, $b$, $c_1$, and $c_2$, can be expressed as follows:

$B=(X'X)^{-1}X'Y=\left(\sum_i{x_ix'_i}\right)^{-1}\left(\sum_i{x_iy_i}\right)$
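As a minimal sketch of the matrix formula above, the following R snippet builds a design matrix $X$ and computes $B=(X'X)^{-1}X'Y$ directly. The simulated data and variable names (`x`, `c1`, `c2`) are illustrative assumptions, not from the source:

```r
# Illustrative sketch: simulated data with one predictor of interest (x)
# and two controls (c1, c2), so B contains (a, b, c1, c2).
set.seed(42)
n  <- 200
x  <- rnorm(n)
c1 <- rnorm(n)
c2 <- rnorm(n)
y  <- 1 + 2 * x + 0.5 * c1 - 0.3 * c2 + rnorm(n)

# Design matrix X: a column of ones (for the intercept a) plus the regressors.
X <- cbind(1, x, c1, c2)

# B = (X'X)^{-1} X'Y; solve(A, b) computes A^{-1} b without forming
# the inverse explicitly, which is numerically more stable.
B <- solve(t(X) %*% X, t(X) %*% y)
B
```

Because OLS has this closed-form solution, the hand-computed `B` should match the coefficients returned by R's built-in `lm()` up to floating-point error.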

The essence of the formula is that it gives us the set of coefficients in $B$ ($a$, $b$, $c_1$, and $c_2$) that minimizes the residual sum of squares. The R code for obtaining the OLS coefficient estimates and related output is straightforward. We begin with the `lm()` function, which fits a linear model. Its arguments are the dependent variable, followed by a tilde and the independent variables joined by plus signs, and the `data=` option for specifying the dataset. Once the model is estimated, the output from the `lm()` function is assigned to an object, which we refer to as `model1`. To display the contents of the model output, we apply the `summary()` function to `model1`.
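The workflow described above can be sketched as follows. The data frame and variable names are hypothetical placeholders (the source does not name a dataset):

```r
# Hypothetical dataset: y regressed on x plus two controls, c1 and c2.
set.seed(1)
df <- data.frame(x = rnorm(100), c1 = rnorm(100), c2 = rnorm(100))
df$y <- 1 + 2 * df$x + 0.5 * df$c1 - 0.3 * df$c2 + rnorm(100)

# Dependent variable, a tilde, independent variables joined by plus signs,
# and data= naming the dataset; the result is assigned to model1.
model1 <- lm(y ~ x + c1 + c2, data = df)

# summary() reports the coefficient estimates along with standard errors,
# t-values, p-values, R-squared, and the F-statistic.
summary(model1)
```

The fitted object `model1` also works with extractor functions such as `coef()`, `residuals()`, and `fitted()` for further analysis.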
