If two (or more) time series are integrated of the same order (e.g. order one, I(1)), but some linear combination of them is integrated of a lower order (e.g. order zero, I(0)), then these series are cointegrated.
In vector notation: if yt is an (n x 1) vector of time series (y1,t, y2,t, ..., yn,t)' all integrated of order d, and β is an (n x 1) cointegrating vector (β1, β2, ..., βn)' such that the linear combination zt = β'yt is integrated of order d-b (with b > 0), then the vector of time series is cointegrated of order CI(d,b).
In fact, there can be more than one cointegrating vector, in which case we deal with an (n x r) matrix β, where 0 < r < n is the number of cointegrating vectors. It follows that there are n-r common stochastic trends.
One advantage of cointegration is that the problem of "spurious regression" is avoided in this case, as the least-squares regression of one cointegrated series on another gives consistent (indeed superconsistent) estimates.
Another advantage is that it is a long-term model (it captures the long-term relationship between the levels of two time series y and x), in contrast to a model fitted on differenced data, which is a short-term model (it captures the effect of a change in x on the change in y). The combination of both approaches (short- and long-term models) is known as the error correction model (ECM), a self-correcting model that allows short-term autoregressive behavior but over time always pulls the series back toward the equilibrium given by the cointegrating vectors.
To test for cointegration and to build an error correction model, several approaches are possible; the most common are Engle-Granger's two-step estimation and Johansen's vector autoregressive (VAR) technique. While the Engle-Granger approach is easier to perform, it allows at most one cointegrating vector (and one has to decide which of the series is dependent and which are independent).