I'm updating this page...

A few notes I have relating to **linear regression models**:

The standard error of a beta coefficient tells us how much the estimate would vary if you applied the same regression analysis to many samples from the same population. (Note that \(SE = \cfrac{SD}{\sqrt{n}}\) is the standard error of a sample *mean*; for the slope in a one-predictor regression it is \(SE(\hat{\beta}_1) = \cfrac{s}{\sqrt{\sum_i (x_i - \bar{x})^2}}\), where \(s\) is the residual standard error.)
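To make the "many samples" interpretation concrete, here is a minimal numpy sketch (the data, variable names, and helper function are mine, purely for illustration): the empirical spread of slopes fitted to repeated samples should match the formula-based standard error.

```python
import numpy as np

rng = np.random.default_rng(0)

def slope_and_se(x, y):
    """OLS slope and its standard error for a one-predictor model."""
    n = len(x)
    xc = x - x.mean()
    beta1 = (xc * (y - y.mean())).sum() / (xc ** 2).sum()
    beta0 = y.mean() - beta1 * x.mean()
    resid = y - (beta0 + beta1 * x)
    s = np.sqrt((resid ** 2).sum() / (n - 2))   # residual standard error
    se = s / np.sqrt((xc ** 2).sum())           # SE of the slope
    return beta1, se

# Draw many samples from the same population: the SD of the fitted
# slopes across samples should be close to the formula-based SE.
slopes = []
for _ in range(2000):
    x = rng.normal(size=50)
    y = 2.0 + 3.0 * x + rng.normal(scale=1.5, size=50)
    b1, se = slope_and_se(x, y)
    slopes.append(b1)

print(np.std(slopes))   # empirical spread of the slope estimates
print(se)               # formula SE from one sample (similar magnitude)
```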

The degrees of freedom associated with the residual standard error are the number of cases in the model minus the number of predictors (counting the intercept): \(df = n - p\). The residual standard error is the square root of the sum of the model's squared residuals divided by the degrees of freedom: \(RSE = \sqrt{\cfrac{\sum_i \hat{\varepsilon}_i^2}{df}}\).
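A short numpy sketch of those two quantities, assuming toy data with two predictors plus an intercept (the numbers are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: n cases, an intercept column plus two predictors.
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = 1.0 + 0.5 * X[:, 1] - 2.0 * X[:, 2] + rng.normal(scale=0.8, size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

p = X.shape[1]                        # predictors, counting the intercept
df = n - p                            # residual degrees of freedom
rse = np.sqrt((resid ** 2).sum() / df)

print(df)    # 97
print(rse)   # should land near the true noise SD of 0.8
```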
You might hear the term *leverage*, a measure of how strongly a data point affects the regression fit. Leverage values range between 0 (no influence) and 1 (strong influence: suboptimal!).
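One standard way to compute leverage is as the diagonal of the hat matrix \(H = X(X^\top X)^{-1}X^\top\); a rough numpy sketch, with a deliberately planted x-outlier (the data are mine):

```python
import numpy as np

rng = np.random.default_rng(2)

# Intercept plus one predictor; the last point is an outlier in x,
# so it should have high leverage.
x = np.append(rng.normal(size=20), 10.0)
X = np.column_stack([np.ones_like(x), x])

# Leverage = diagonal of the hat matrix H = X (X'X)^{-1} X'.
H = X @ np.linalg.inv(X.T @ X) @ X.T
leverage = np.diag(H)

print(leverage.max())   # the x-outlier's leverage, close to 1
print(leverage.sum())   # always equals the number of parameters (2 here)
```

A handy sanity check: the leverages always sum to the number of fitted parameters, so a point with leverage far above the average \(p/n\) deserves a closer look.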
*Principle of parsimony* (Occam’s razor): explanations that require fewer assumptions are more likely to be true. Model fitting should be guided by this: prefer the simplest model that adequately fits the data.