The Bayesian information criterion (BIC), also known as the Schwarz criterion (SIC), is a measure of the goodness of fit of a statistical model, and is often used as a criterion for selecting among a finite set of models. It is based on the log-likelihood function (LLF) and is closely related to the Akaike information criterion (AIC).
Like the AIC, the BIC introduces a penalty term for the number of parameters in the model, but the BIC's penalty is larger than the AIC's.
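For comparison, the AIC for the same model is defined as:
$$\mathit{AIC}=2k-2\ln(L)$$
The BIC, defined below, replaces the per-parameter penalty factor of $2$ with $\ln(n)$; since $\ln(n)>2$ once the sample contains $n\geq 8$ observations, the BIC penalizes extra parameters more heavily in all but the smallest samples.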
- In general, the BIC is defined as:
$$\mathit{BIC}=k\ln(n)-2\ln(L)$$
Where:
- $k$ is the number of estimated parameters in the model.
- $n$ is the number of observations (i.e., the sample size).
- $\ln(L)$ is the maximized log-likelihood function for the statistical model.
- Given any two estimated models, the model with the lower value of BIC is preferred; a lower BIC implies either fewer explanatory variables, a better fit, or both (see the sketch after this list).
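As a minimal sketch, the snippet below computes the BIC directly from the formula above and compares two hypothetical models fitted to the same data set; the `bic` function and the log-likelihood values are illustrative assumptions, not taken from any particular library.

```python
import numpy as np

def bic(log_likelihood: float, k: int, n: int) -> float:
    # BIC = k*ln(n) - 2*ln(L); lower values are preferred.
    return k * np.log(n) - 2.0 * log_likelihood

# Two hypothetical models estimated on the same n = 100 observations.
n = 100
bic_a = bic(log_likelihood=-210.3, k=3, n=n)  # 3 estimated parameters
bic_b = bic(log_likelihood=-208.9, k=5, n=n)  # 5 parameters, slightly better fit

print(f"BIC(A) = {bic_a:.2f}")  # ~434.42
print(f"BIC(B) = {bic_b:.2f}")  # ~440.83
print("Preferred model:", "A" if bic_a < bic_b else "B")
```

Here model B fits slightly better (higher log-likelihood), but its two extra parameters cost $2\ln(100)\approx 9.2$ BIC points, more than the fit improvement is worth, so the more parsimonious model A is preferred.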
Remarks
- It is essential to remember that the BIC can be used to compare estimated models only when they are fitted to the same data set, i.e., when the numerical values of the dependent variable are identical for all models being compared.
- BIC has been widely used for model identification in time series and linear regression. It can, however, be applied to any set of maximum-likelihood-based models, as in the order-selection sketch below.
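As a rough illustration of model identification, the sketch below selects the degree of a polynomial regression by minimizing the BIC over candidate degrees. The synthetic data, the `gaussian_bic` helper, and the parameter count (polynomial coefficients plus the estimated error variance) are our own assumptions for this example, not a standard API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a quadratic signal plus Gaussian noise.
n = 200
x = np.linspace(-3, 3, n)
y = 1.0 + 0.5 * x - 0.8 * x**2 + rng.normal(scale=1.0, size=n)

def gaussian_bic(y, y_hat, k, n):
    # For least squares with Gaussian errors, the maximized
    # log-likelihood is ln(L) = -n/2 * (ln(2*pi) + ln(RSS/n) + 1).
    rss = np.sum((y - y_hat) ** 2)
    log_l = -0.5 * n * (np.log(2.0 * np.pi) + np.log(rss / n) + 1.0)
    return k * np.log(n) - 2.0 * log_l

# Score polynomial fits of increasing degree; k counts the
# degree + 1 coefficients plus the estimated error variance.
for degree in range(1, 6):
    coeffs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coeffs, x)
    print(f"degree {degree}: BIC = {gaussian_bic(y, y_hat, degree + 2, n):.1f}")
```

On data like this, the BIC should pick out degree 2, the true order of the generating process; higher degrees improve the fit slightly but typically not enough to offset the $\ln(n)$ penalty per extra coefficient.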