Simple Linear Regression (SLR) Model

In statistics, simple linear regression is the least squares estimator of a linear regression model with a single explanatory variable. In other words, simple linear regression fits a straight line through the set of $n$ data points in such a way that the sum of squared residuals of the model (that is, the vertical distances between the points of the data set and the fitted line) is as small as possible.

For the SLR, the objective is to find the straight line that provides the best fit for the data points $\left(x_i,y_i\right)$:
$$y = \alpha + \beta \times x$$
where:

  • $\alpha$ is the constant (aka intercept) of the regression.
  • $\beta$ is the coefficient (aka slope) of the explanatory variable.
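
Minimizing the sum of squared residuals with respect to $\alpha$ and $\beta$ yields the standard closed-form least-squares estimates, where $\bar x$ and $\bar y$ denote the sample means of the data:
$$\hat{\beta} = \frac{\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right)}{\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2},\qquad \hat{\alpha} = \bar{y} - \hat{\beta}\,\bar{x}$$

As an illustration only, below is a minimal Python/NumPy sketch of these formulas; the variable names (e.g. `alpha_hat`, `beta_hat`) and the sample data are made up for the example and are not part of any particular library's API.

```python
import numpy as np

# Illustrative data points (x_i, y_i); any two numeric arrays of equal length work.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()

# Slope: sum of cross-deviations divided by the sum of squared deviations of x.
beta_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)

# Intercept: chosen so the fitted line passes through (x_bar, y_bar).
alpha_hat = y_bar - beta_hat * x_bar

# Fitted values and residuals (the vertical distances minimized by least squares).
y_hat = alpha_hat + beta_hat * x
residuals = y - y_hat

print("intercept:", alpha_hat)
print("slope:", beta_hat)
print("residual sum of squares:", np.sum(residuals ** 2))
```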


Remarks

  1. The number of rows in the response variable (Y) must equal the number of rows in the explanatory variable (X).
  2. By construction, the regression line passes through the center of mass of the data, $\left( \bar X, \bar Y \right)$ (see the short derivation below).
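
This follows from the least-squares intercept: since $\hat{\alpha} = \bar{Y} - \hat{\beta}\,\bar{X}$ (the second estimation formula above, with $\bar{X}$ and $\bar{Y}$ denoting the sample means), evaluating the fitted line at $x = \bar{X}$ gives
$$\hat{y} = \hat{\alpha} + \hat{\beta}\,\bar{X} = \left(\bar{Y} - \hat{\beta}\,\bar{X}\right) + \hat{\beta}\,\bar{X} = \bar{Y}$$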
