
Simple Linear Regression for a Numerical Explanatory Variable

Explore the fundamentals of simple linear regression focused on a numerical explanatory variable. Learn how to fit a linear model in R with lm(), interpret the intercept and slope coefficients, and understand their practical and statistical significance. This lesson guides you through obtaining and reading regression output tables using the moderndive package, enabling you to analyze relationships between variables effectively.


Recall from algebra that the equation of a line is $y = a + b \cdot x$. (Note that the $\cdot$ symbol is equivalent to the `*` "multiply by" mathematical symbol. We'll use the $\cdot$ symbol in the rest of this course as it's more succinct.) A line is defined by two coefficients, $a$ and $b$. The intercept coefficient $a$ is the value of $y$ when $x = 0$. The slope coefficient $b$ for $x$ is the increase in $y$ for every increase of one unit in $x$. This is also called the "rise over run."
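The intercept and slope can be illustrated with a few lines of base R. This is a minimal sketch with hypothetical coefficient values (`a = 3`, `b = 2`), chosen only to make the arithmetic easy to check:

```r
# The line y = a + b * x, with hypothetical coefficients
a <- 3            # intercept: the value of y when x = 0
b <- 2            # slope: the increase in y for each one-unit increase in x
x <- c(0, 1, 2, 3)
y <- a + b * x
y
#> [1] 3 5 7 9

# "Rise over run": the change in y divided by the change in x recovers the slope
(y[2] - y[1]) / (x[2] - x[1])
#> [1] 2
```

Note that when `x = 0`, `y` equals the intercept `a = 3`, and each step of one in `x` raises `y` by the slope `b = 2`.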

However, when defining a regression line, we use a slightly different notation, i.e., the equation of the regression line is $\hat{y} = b_0 + b_1 \cdot x$ ...
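In R, the coefficients $b_0$ and $b_1$ are estimated by fitting a model with `lm()` and then reading them off a regression table. Here is a minimal sketch using the `evals` dataset that ships with the moderndive package (an assumed example dataset; any data frame with a numerical outcome and a numerical explanatory variable works the same way):

```r
# Fit a simple linear regression: outcome ~ explanatory
library(moderndive)

# score is the outcome y; bty_avg is the numerical explanatory variable x
score_model <- lm(score ~ bty_avg, data = evals)

# The moderndive regression table reports the intercept (b0) and
# the slope for bty_avg (b1), along with their statistical summaries
get_regression_table(score_model)
```

The `estimate` column of the resulting table contains $b_0$ (the row labeled `intercept`) and $b_1$ (the row labeled with the explanatory variable's name).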