Lecture 6: Vector Autoregression

In this section, we will extend our discussion to vector-valued time series. We will be mostly interested in vector autoregression (VAR), which is much easier to estimate in applications. We will first introduce the properties and basic tools for analyzing stationary VAR processes, and then we will move on to estimation and inference for the VAR model.

1 Covariance-stationary VAR(p) process

1.1 Introduction to stationary vector ARMA processes

1.1.1 VAR processes

A VAR model applies when each variable in the system depends not only on its own lags, but also on the lags of the other variables. A simple VAR example is:

x1t = φ11 x1,t−1 + φ12 x2,t−1 + ε1t
x2t = φ21 x2,t−1 + φ22 x2,t−2 + ε2t

where E(ε1t ε2s) = σ12 for t = s and zero for t ≠ s. We can rewrite it as

[x1t]   [φ11  φ12] [x1,t−1]   [0   0 ] [x1,t−2]   [ε1t]
[x2t] = [0    φ21] [x2,t−1] + [0  φ22] [x2,t−2] + [ε2t]

or just

xt = Φ1 xt−1 + Φ2 xt−2 + εt,        (1)

where E(εt) = 0, E(εt εs') = 0 for s ≠ t, and

E(εt εt') = [σ1^2  σ12]
            [σ21  σ2^2].

As you can see, in this example the vector-valued random variable xt follows a VAR(2) process. A general VAR(p) process with white noise can be written as

xt = Φ1 xt−1 + Φ2 xt−2 + . . . + Φp xt−p + εt = Σ_{j=1}^p Φj xt−j + εt,

or, if we make use of the lag operator,

Φ(L) xt = εt,

where

Φ(L) = Ik − Φ1 L − . . . − Φp L^p.

The error terms follow a vector white noise process, i.e., E(εt) = 0 and

E(εt εs') = Ω for t = s, and 0 otherwise,

with Ω a (k × k) symmetric positive definite matrix. Recall that in studying the scalar AR(p) process,

φ(L) xt = εt,

we have the result that the process {xt} is covariance-stationary as long as all the roots of

1 − φ1 z − φ2 z^2 − . . . − φp z^p = 0        (2)

lie outside the unit circle. Similarly, for the VAR(p) process to be stationary, we must have that all the roots of the equation

|Ik − Φ1 z − . . . − Φp z^p| = 0

lie outside the unit circle.

1.1.2 Vector moving average processes

Recall that we could invert a scalar stationary AR(p) process, φ(L) xt = εt, to an MA(∞) process, xt = θ(L) εt, where θ(L) = φ(L)^{−1}. The same is true for a covariance-stationary VAR(p) process,

Φ(L) xt = εt.

We can invert it to

xt = Ψ(L) εt,

where

Ψ(L) = Φ(L)^{−1}.

The coefficients of Ψ can be solved in the same way as in the scalar case, i.e., if Φ^{−1}(L) = Ψ(L), then Φ(L) Ψ(L) = Ik:

(Ik − Φ1 L − Φ2 L^2 − . . . − Φp L^p)(Ik + Ψ1 L + Ψ2 L^2 + . . .) = Ik.

Equating the coefficients of L^j, we have Ψ0 = Ik, Ψ1 = Φ1, Ψ2 = Φ1 Ψ1 + Φ2, and in general, we have
Ψs = Φ1Ψs−1 + Φ2Ψs−2 + . . . + ΦpΨs−p.
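As a quick numerical sanity check, this recursion is easy to implement. The sketch below uses illustrative coefficient matrices for a bivariate VAR(2) (the values are made up, not taken from the notes), and verifies the standard equivalent characterization that Ψs is the top-left k × k block of the s-th power of the companion matrix.

```python
import numpy as np
from numpy.linalg import matrix_power

# Illustrative Phi_1, Phi_2 for a bivariate VAR(2) (made-up values,
# not from the lecture).
Phi = [np.array([[0.5, 0.1],
                 [0.0, 0.2]]),            # Phi_1
       np.array([[0.0, 0.0],
                 [0.0, 0.3]])]            # Phi_2
k = Phi[0].shape[0]

def ma_coefficients(Phi, n):
    """Psi_0, ..., Psi_n via Psi_s = Phi_1 Psi_{s-1} + ... + Phi_p Psi_{s-p}."""
    k = Phi[0].shape[0]
    Psi = [np.eye(k)]                     # Psi_0 = I_k
    for s in range(1, n + 1):
        Psi.append(sum(Phi[j] @ Psi[s - 1 - j]
                       for j in range(min(s, len(Phi)))))
    return Psi

Psi = ma_coefficients(Phi, 5)

# Cross-check: Psi_s equals the top-left k-by-k block of F^s, where F is
# the companion matrix of the VAR.
F = np.block([[Phi[0], Phi[1]],
              [np.eye(k), np.zeros((k, k))]])
assert all(np.allclose(Psi[s], matrix_power(F, s)[:k, :k]) for s in range(6))
```

The first few terms reproduce the hand calculation: Ψ1 = Φ1 and Ψ2 = Φ1Φ1 + Φ2.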
1.2 Transforming to a state space representation
Sometimes it is more convenient to write a scalar-valued time series, say an AR(p) process, in vector form. For example,
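The notes break off at this point, but the classic instance of this transformation is the companion form: stacking (xt, xt−1, . . . , xt−p+1) turns an AR(p) into a VAR(1). A minimal sketch with illustrative AR(2) coefficients (made-up values, not from the notes), checking that the scalar and stacked recursions generate the same path:

```python
import numpy as np

# Illustrative AR(2): x_t = 0.6 x_{t-1} - 0.2 x_{t-2} + eps_t (made-up values).
phi1, phi2 = 0.6, -0.2
T = 200
eps = np.random.default_rng(0).standard_normal(T)

# Scalar recursion, started from zeros.
x = np.zeros(T)
for t in range(2, T):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + eps[t]

# Companion (VAR(1)) form: xi_t = F xi_{t-1} + (eps_t, 0)',
# with the stacked state xi_t = (x_t, x_{t-1})'.
F = np.array([[phi1, phi2],
              [1.0,  0.0]])
xi = np.zeros(2)                 # xi_1 = (x_1, x_0)' = (0, 0)'
x_companion = np.zeros(T)
for t in range(2, T):
    xi = F @ xi + np.array([eps[t], 0.0])
    x_companion[t] = xi[0]

# The two recursions produce identical paths.
assert np.allclose(x, x_companion)
```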
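Tying the two ideas together: once a VAR(p) is stacked into its companion VAR(1) form, the stationarity condition of Section 1.1.1 (all roots of |Ik − Φ1 z − . . . − Φp z^p| = 0 outside the unit circle) is equivalent to all eigenvalues of the companion matrix lying strictly inside the unit circle. A sketch with illustrative coefficients (made-up values, not from the notes):

```python
import numpy as np

# Illustrative bivariate VAR(2) coefficients (made-up values).
Phi1 = np.array([[0.5, 0.1],
                 [0.0, 0.2]])
Phi2 = np.array([[0.0, 0.0],
                 [0.0, 0.3]])
k = 2

# Companion matrix: its nonzero eigenvalues are the reciprocals of the roots
# of |I - Phi1 z - Phi2 z^2| = 0, so "all roots outside the unit circle"
# is equivalent to "all eigenvalues strictly inside the unit circle".
F = np.block([[Phi1, Phi2],
              [np.eye(k), np.zeros((k, k))]])
eigvals = np.linalg.eigvals(F)
is_stationary = bool(np.all(np.abs(eigvals) < 1))
print(is_stationary)   # prints True: this example is stationary
```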