# Appendix E: Hannan-Quinn Information Criterion (HQC)

The Hannan-Quinn information criterion (HQC) is a measure of the goodness of fit of a statistical model and is often used as a criterion for model selection among a finite set of models. In the form presented here it is not based on the log-likelihood function (LLF), but it is closely related to Akaike's information criterion (AIC).

Similar to the AIC, the HQC introduces a penalty term for the number of parameters in the model, but its penalty, $2k\ln(\ln n)$, is larger than the AIC's penalty of $2k$ (for sample sizes $n > e^{e} \approx 15$).
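To make the comparison concrete, the sketch below (a hypothetical illustration; the function names are not from the text) evaluates both penalty terms for an assumed sample size of $n = 100$ and $k = 3$ parameters:

```python
import math

def aic_penalty(k):
    """AIC penalty term: 2k."""
    return 2 * k

def hqc_penalty(k, n):
    """HQC penalty term: 2k * ln(ln n)."""
    return 2 * k * math.log(math.log(n))

# For n = 100, ln(ln 100) ≈ 1.527 > 1, so the HQC penalty
# exceeds the AIC penalty for the same number of parameters.
print(aic_penalty(3))       # 6
print(hqc_penalty(3, 100))  # ≈ 9.16
```

Because $\ln(\ln n)$ grows with $n$, the gap between the two penalties widens (slowly) as the sample size increases.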

1. In general, the HQC is defined as: $$HQC=n \times \ln{\frac{RSS}{n}} +2\times k \times \ln(\ln n)$$ Where:
• $n$ is the number of observations.
• $k$ is the number of model parameters.
• $RSS$ is the residual sum of squares that results from the statistical model.
2. Given any two estimated models, the model with the lower value of HQC is preferred; a lower HQC implies either fewer explanatory variables, better fit, or both.
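The definition and selection rule above can be sketched as follows. The fit statistics here (RSS values, $n$, $k$) are invented for illustration and do not come from the text:

```python
import math

def hqc(rss, n, k):
    """Hannan-Quinn criterion: n * ln(RSS/n) + 2k * ln(ln n)."""
    return n * math.log(rss / n) + 2 * k * math.log(math.log(n))

# Hypothetical results for two models fitted to the same n = 50
# observations of the same dependent variable:
n = 50
hqc_a = hqc(rss=120.0, n=n, k=2)  # model A: 2 parameters
hqc_b = hqc(rss=110.0, n=n, k=4)  # model B: 4 parameters, better fit

# The model with the lower HQC is preferred.
best = "A" if hqc_a < hqc_b else "B"
```

In this made-up case model B achieves a smaller RSS, but its two extra parameters cost more in penalty than the improved fit saves, so model A has the lower HQC and is preferred.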

Remarks
1. HQC, like BIC, but unlike AIC, is not asymptotically efficient.
2. It is important to keep in mind that the HQC can be used to compare estimated models only when the numerical values of the dependent variable are identical for all estimates being compared.
3. The HQC has been widely used for model identification in time series and linear regression. It can, however, be applied quite broadly to any set of maximum likelihood-based models.