Computes the maximum likelihood estimates (MLE) of the model's parameters.
GARCH_CALIBRATE(X, Order, Model, Mask, Method, maxIter)
X - is the univariate time series data (a one-dimensional array of cells (e.g. rows or columns)).
Order - is the time order of the data series (i.e. whether the first data point corresponds to the earliest or the latest date; earliest date = 1 (default), latest date = 0).
Order   Description
1       Ascending: the first data point corresponds to the earliest date (default).
0       Descending: the first data point corresponds to the latest date.
Model - is the GARCH model representation array (a one-dimensional array of cells (e.g. rows or columns)) (see the GARCH function).
Mask - is an array of 0's and 1's specifying which parameters to calibrate. If missing, all parameters are included in the calibration.
Method - is the calibration/fitting method (1 = MLE, 2 = Bayesian). If missing, a maximum likelihood estimate (MLE) is assumed.
Method  Description
1       Maximum Likelihood Estimate (MLE) (default).
2       Bayesian.
maxIter - is the maximum number of iterations used to calibrate the model. If missing, the default maximum of 100 is assumed.
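To illustrate how a parameter mask like the one above can work, here is a minimal Python sketch (not NumXL's implementation): entries flagged 1 are exposed to the optimizer while entries flagged 0 stay fixed. The parameter layout [mu, omega, alpha, beta] and all numeric values are illustrative assumptions.

```python
import numpy as np

# Hypothetical GARCH(1,1) parameter vector: [mu, omega, alpha, beta].
full = np.array([0.0, 0.1, 0.1, 0.8])
mask = np.array([0, 1, 1, 1])        # 0 = hold fixed, 1 = calibrate

free0 = full[mask == 1]              # initial values of the free parameters

def expand(free, full=full, mask=mask):
    """Rebuild the full parameter vector from the free subset."""
    theta = full.copy()
    theta[mask == 1] = free
    return theta

# e.g. the optimizer proposes new values for the free parameters only;
# the masked-out mu keeps its fixed value of 0.0.
theta = expand(free0 * 1.1)
print(theta)
```

The optimizer then works entirely in the reduced space of free parameters, and `expand` maps each candidate back to a full parameter vector before evaluating the likelihood.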
- The underlying model is described here.
- The time series is homogeneous or equally spaced.
- The time series may include missing values (e.g. #N/A) at either end.
- Maximum likelihood estimation (MLE) is a statistical method for fitting a model to data, providing estimates for the model's parameters.
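As a concrete sketch of the MLE remark above, the following Python snippet fits a GARCH(1,1) model to simulated returns by maximizing the Gaussian log-likelihood. This is not NumXL's implementation: the data are simulated, omega is set by variance targeting, and a crude grid search stands in for a proper numerical optimizer.

```python
import numpy as np

def garch11_loglik(r, omega, alpha, beta):
    """Gaussian log-likelihood of returns r under the GARCH(1,1) variance recursion."""
    h = np.empty(len(r))
    h[0] = np.var(r)                      # seed the recursion with the sample variance
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return -0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

# Simulate illustrative GARCH(1,1) returns (true omega=0.1, alpha=0.1, beta=0.8).
rng = np.random.default_rng(0)
n = 2000
r = np.empty(n)
h = 0.1 / (1 - 0.1 - 0.8)                 # unconditional variance as starting value
for t in range(n):
    r[t] = np.sqrt(h) * rng.standard_normal()
    h = 0.1 + 0.1 * r[t] ** 2 + 0.8 * h

# Crude grid search over (alpha, beta); omega fixed by variance targeting.
best_ll, best_params = -np.inf, None
for a in np.arange(0.02, 0.30, 0.02):
    for b in np.arange(0.50, 0.95, 0.05):
        if a + b >= 1:                    # enforce covariance stationarity
            continue
        w = np.var(r) * (1 - a - b)
        ll = garch11_loglik(r, w, a, b)
        if ll > best_ll:
            best_ll, best_params = ll, (w, a, b)

print("estimated (omega, alpha, beta):", best_params)
```

A production calibration would replace the grid search with a constrained numerical optimizer and stop after a maximum number of iterations, which is the role of the maxIter argument above.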