Estimation of AR Models
Oh, Hyunzi. (email: wisdom302@naver.com)
Korea University, Graduate School of Economics.
Main References
Consider an
Proof. Since the model does not contain a constant term, we can derive the autocovariance function directly. First, by multiplying
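As a concrete check, the multiply-and-take-expectations argument yields the recursion $\gamma(k) = \phi\,\gamma(k-1)$ in the AR(1) special case. The following is an illustrative Python sketch (the function name and the zero-mean AR(1) parametrization are my own choices, not from the text):

```python
import numpy as np

def ar1_autocovariance(phi, sigma2, k):
    """Theoretical autocovariance gamma(k) of a zero-mean AR(1):
    y_t = phi * y_{t-1} + e_t, Var(e_t) = sigma2, |phi| < 1.
    Multiplying the model by y_{t-k} and taking expectations gives
    gamma(k) = phi * gamma(k-1), with gamma(0) = sigma2 / (1 - phi**2)."""
    gamma0 = sigma2 / (1.0 - phi**2)
    return gamma0 * phi**abs(k)

# The autocorrelation rho(k) = gamma(k) / gamma(0) decays geometrically.
phi, sigma2 = 0.8, 1.0
rhos = [ar1_autocovariance(phi, sigma2, k) / ar1_autocovariance(phi, sigma2, 0)
        for k in range(4)]
print([round(r, 3) for r in rhos])  # → [1.0, 0.8, 0.64, 0.512]
```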
Consider an
Then, using Geometry of Least Squares Estimator > Proposition 3 (Ordinary Least Squares estimator of
Note that in matrix form, we have
The proof is identical to that of Asymptotic Results in Basic Linear Model > Theorem 3 (consistency of least-square estimator). From
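The OLS estimator above can be sketched numerically. This is an illustrative Python snippet (the helper `ols_ar` and the zero-mean AR(2) simulation are assumptions for the demo, not part of the text): build the lagged regressor matrix, solve the normal equations, and observe that the estimates approach the true coefficients as the sample grows, in line with the consistency result.

```python
import numpy as np

def ols_ar(y, p):
    """OLS estimator of a zero-mean AR(p): regress y_t on its first p lags."""
    T = len(y)
    Y = y[p:]                                      # y_p, ..., y_{T-1}
    X = np.column_stack([y[p - 1 - j : T - 1 - j]  # column j holds lag j+1
                         for j in range(p)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)   # solves (X'X) b = X'Y
    return beta

rng = np.random.default_rng(0)
phi = np.array([0.5, -0.3])                        # true (stationary) AR(2)
y = np.zeros(5000)
for t in range(2, 5000):
    y[t] = phi @ y[t - 2 : t][::-1] + rng.standard_normal()

print(ols_ar(y, 2))  # close to [0.5, -0.3] for large T
```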
The asymptotic theory will be discussed in detail in Asymptotic Theory on Time Series Models later on.
Let
If the joint distribution of
Consider an
Using the definition of conditional density, we have
Now, for the full MLE, consider what was previously defined as
Note that for
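The factorization of the joint density into conditional densities can be sketched for the Gaussian AR(1) case. This is an illustrative Python example (the names `ar1_cond_loglik`, `c_hat`, etc. and the simulated setup are my own): conditioning on the first observation, the MLE of the regression coefficients coincides with OLS, and the variance estimate is the mean squared residual.

```python
import numpy as np

def ar1_cond_loglik(params, y):
    """Conditional log-likelihood of a Gaussian AR(1), treating y_0 as fixed:
    sum over t >= 1 of log N(y_t; c + phi * y_{t-1}, sigma2)."""
    c, phi, sigma2 = params
    e = y[1:] - c - phi * y[:-1]
    n = len(e)
    return -0.5 * n * np.log(2 * np.pi * sigma2) - (e @ e) / (2 * sigma2)

# Simulate y_t = 1 + 0.6 y_{t-1} + e_t and recover the conditional MLE.
rng = np.random.default_rng(1)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 1.0 + 0.6 * y[t - 1] + rng.standard_normal()

X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
(c_hat, phi_hat), *_ = np.linalg.lstsq(X, y[1:], rcond=None)
sigma2_hat = np.mean((y[1:] - c_hat - phi_hat * y[:-1]) ** 2)

# The MLE maximizes the conditional likelihood, so it dominates the true values.
print(ar1_cond_loglik((c_hat, phi_hat, sigma2_hat), y)
      >= ar1_cond_loglik((1.0, 0.6, 1.0), y))  # → True
```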
Consider an
Then, for
Before moving on, we introduce a useful theorem:
Let
Note that this theorem is a random-variable version of Statistical Proof > Theorem 7 (pdf of invertible function of a continuous random vector).
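As a concrete illustration of how such a change-of-variables argument is used, consider a Gaussian AR(1) $y_t = c + \phi y_{t-1} + \varepsilon_t$ with $|\phi| < 1$ (an assumed special case; the notation may differ from this note's). Combining the stationary density of the first observation with the conditional densities of the remaining ones gives the exact log-likelihood

$$
\ell(c,\phi,\sigma^2)
= -\frac{T}{2}\log(2\pi\sigma^2)
+ \frac{1}{2}\log(1-\phi^2)
- \frac{(1-\phi^2)(y_1-\mu)^2}{2\sigma^2}
- \sum_{t=2}^{T}\frac{(y_t - c - \phi y_{t-1})^2}{2\sigma^2},
\qquad \mu = \frac{c}{1-\phi},
$$

where the middle two terms come from $y_1 \sim N\!\left(\mu,\ \sigma^2/(1-\phi^2)\right)$ and the final sum from the conditional densities.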
Set the values of the initial errors to zero, i.e.
Now we consider the model that incorporates
By letting
Remark that
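The role of the zero initial errors can be illustrated with an MA(1) (an assumed special case; the function name is mine): given $\mu$ and $\theta$, the innovations are recovered by the recursion $\varepsilon_t = y_t - \mu - \theta\varepsilon_{t-1}$ started at $\varepsilon_0 = 0$.

```python
import numpy as np

def ma1_resid(y, mu, theta):
    """Recover the innovations of an MA(1) y_t = mu + e_t + theta * e_{t-1}
    by the recursion e_t = y_t - mu - theta * e_{t-1}, with e_0 set to 0."""
    e = np.zeros(len(y))
    for t in range(len(y)):
        prev = e[t - 1] if t > 0 else 0.0   # the conditioning step: e_0 = 0
        e[t] = y[t] - mu - theta * prev
    return e

rng = np.random.default_rng(2)
eps = rng.standard_normal(1000)
# Generate an MA(1) whose true pre-sample error is exactly zero.
y = 0.5 + eps + 0.4 * np.concatenate([[0.0], eps[:-1]])
e = ma1_resid(y, 0.5, 0.4)
print(np.allclose(e, eps))  # → True: exact recovery since the true e_0 was zero
```

When the true pre-sample error is not zero, the recursion's error dies out geometrically provided the MA polynomial is invertible, which is why the conditioning is asymptotically innocuous.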
For an
Conditional on the initial observations
Note that Maximum Likelihood Estimation > Theorem 8 (asymptotic normality of MLE) still applies, and the standard errors of the parameter estimates can be obtained from the Hessian matrix.
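A sketch of how such Hessian-based standard errors could be computed in practice (illustrative Python; the finite-difference Hessian, the `negloglik` parametrization, and the AR(1) setup are my own choices, not from the text): evaluate the Hessian of the negative conditional log-likelihood at the estimate and take square roots of the diagonal of its inverse.

```python
import numpy as np

def negloglik(params, y):
    """Negative conditional Gaussian log-likelihood of an AR(1) (y_0 fixed).
    We optimize log(sigma2) so the variance stays positive."""
    c, phi, log_s2 = params
    s2 = np.exp(log_s2)
    e = y[1:] - c - phi * y[:-1]
    return 0.5 * len(e) * np.log(2 * np.pi * s2) + (e @ e) / (2 * s2)

def numerical_hessian(f, x, h=1e-3):
    """Central finite-difference Hessian of f at x."""
    k = len(x)
    H = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            xpp = x.copy(); xpp[i] += h; xpp[j] += h
            xpm = x.copy(); xpm[i] += h; xpm[j] -= h
            xmp = x.copy(); xmp[i] -= h; xmp[j] += h
            xmm = x.copy(); xmm[i] -= h; xmm[j] -= h
            H[i, j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4 * h * h)
    return H

rng = np.random.default_rng(3)
y = np.zeros(3000)
for t in range(1, 3000):
    y[t] = 0.7 * y[t - 1] + rng.standard_normal()

# Conditional MLE: (c, phi) via OLS, sigma2 as the mean squared residual.
X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
(c_hat, phi_hat), *_ = np.linalg.lstsq(X, y[1:], rcond=None)
s2_hat = np.mean((y[1:] - c_hat - phi_hat * y[:-1]) ** 2)

theta = np.array([c_hat, phi_hat, np.log(s2_hat)])
H = numerical_hessian(lambda p: negloglik(p, y), theta)
se = np.sqrt(np.diag(np.linalg.inv(H)))   # asymptotic standard errors
print(np.round(se, 3))
```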
While goodness of fit is naturally used as an order-selection criterion, in
Consider an
The information criterion for each
The two most popular criteria are given as follows:
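For reference, in their most common normalization (the constants may differ from the convention used in this note), with $\hat{\ell}_p$ the maximized log-likelihood of the order-$p$ fit, $k$ the number of estimated parameters, and $T$ the sample size, the two criteria are

$$
\mathrm{AIC}(p) = -2\hat{\ell}_p + 2k,
\qquad
\mathrm{BIC}(p) = -2\hat{\ell}_p + k\log T,
$$

and the order is chosen to minimize the criterion. BIC's penalty $k\log T$ grows with the sample size, while AIC's penalty is fixed.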
The value of
First, suppose
In case of AIC,
Now suppose
Therefore, we have
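The order-selection argument above can be illustrated numerically. This is a sketch under assumed conventions (conditional Gaussian likelihood, zero-mean AR fits, and a common effective sample across candidate orders so the criteria are comparable); the helper `fit_ar` and the simulated AR(2) are mine.

```python
import numpy as np

def fit_ar(y, p, n_init):
    """Conditional Gaussian fit of a zero-mean AR(p), using the same
    effective sample (t >= n_init) for every candidate order.
    Returns the maximized log-likelihood and the parameter count."""
    T = len(y)
    Y = y[n_init:]
    X = np.column_stack([y[n_init - 1 - j : T - 1 - j] for j in range(p)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    s2 = np.mean((Y - X @ beta) ** 2)
    n = len(Y)
    # Gaussian log-likelihood concentrated at the variance MLE.
    return -0.5 * n * (np.log(2 * np.pi * s2) + 1), p

rng = np.random.default_rng(4)
y = np.zeros(4000)
for t in range(2, 4000):
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

max_p = 6
T_eff = len(y) - max_p
scores = []
for p in range(1, max_p + 1):
    ll, k = fit_ar(y, p, max_p)          # common n_init aligns the samples
    aic = -2 * ll + 2 * k
    bic = -2 * ll + k * np.log(T_eff)
    scores.append((p, aic, bic))

print("AIC picks p =", min(scores, key=lambda s: s[1])[0])
print("BIC picks p =", min(scores, key=lambda s: s[2])[0])
```

With a large sample, both criteria reject the underfit p = 1 decisively; the difference between them shows up in how often AIC selects an order above the true one.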