GARCHM_CALIBRATE - Optimal Values for Model's Parameters

Computes the maximum likelihood estimate (MLE) of the GARCH-M model's parameters.

Syntax

GARCHM_CALIBRATE(X, Order, Model, Mask, Method, maxIter)
X
is the univariate time series data (a one-dimensional array of cells (e.g., rows or columns)).
Order
is the time order of the data series, i.e., whether the first data point corresponds to the earliest date (1, default) or to the latest date (0).
Order Description
1 ascending (the first data point corresponds to the earliest date) (default)
0 descending (the first data point corresponds to the latest date)
Model
is the GARCH-M model representation array (a one-dimensional array of cells (e.g., rows or columns)); see the GARCHM function.
Mask
is an array of 0's and 1's that specifies which parameters to calibrate. If missing, all parameters are included in the calibration.
Method
is the calibration/fitting method (1=MLE, 2=Bayesian). If missing, the maximum likelihood estimation (MLE) method is used.
Method Description
1 Maximum Likelihood Estimate (MLE)
2 Bayesian
maxIter
is the maximum number of iterations used to calibrate the model. If missing, the default maximum of 100 is assumed.
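
For illustration only, a hypothetical worksheet call might look like this, assuming the return series occupies cells B2:B201 and the GARCH-M model representation array (as used with the GARCHM function) occupies cells D2:D8; these cell ranges are examples and are not part of the function's specification:

=GARCHM_CALIBRATE($B$2:$B$201, 1, $D$2:$D$8)

With Mask, Method, and maxIter omitted, all parameters are included in the calibration, the MLE method is used, and at most 100 iterations are performed.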

Remarks

  1. The underlying model is the GARCH-in-mean (GARCH-M) model; see the GARCHM function for its representation.
  2. The time series is assumed to be homogeneous, i.e., equally spaced in time.
  3. The time series may include missing values (e.g. #N/A) at either end.
  4. Maximum likelihood estimation (MLE) is a statistical method that fits the model to the data by choosing the parameter values that maximize the likelihood of the observed series, as sketched below.
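
As a rough sketch only (assuming Gaussian innovations, a common but not the only choice), the MLE step selects the parameter vector \theta that maximizes the conditional log-likelihood of the observed series:

\ln L(\theta) = -\frac{1}{2}\sum_{t=1}^{T}\left[\ln(2\pi\sigma_t^2)+\frac{(x_t-\mu_t)^2}{\sigma_t^2}\right]

where \mu_t and \sigma_t^2 denote the conditional mean and conditional variance implied by the GARCH-M model at time t.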

 
