Wooldridge Econometrics Lecture Slides: Chapter 11


How is stationarity used in time series econometrics? On a technical level, stationarity simplifies statements of the law of large numbers and the central limit theorem. On a practical level, if we want to understand the relationship between two or more variables using regression analysis, we need to assume some sort of stability over time: (i) the βj do not change over time; (ii) the variance of the error process is constant over time; (iii) the correlation between errors in two adjacent periods is equal to zero.
A practical perspective on stationarity: it can be very difficult to determine whether the data we have collected were generated by a stationary process. However, it is easy to spot certain sequences that are not stationary. For example, a process with a time trend (a mean that changes over time) is clearly nonstationary.
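As a rough illustration (not from the slides), the following Python sketch, assuming numpy is available, simulates a trending series and a level series; comparing early and late sample means makes the changing mean of the trending process visible. The slope 0.05 and the level 2.0 are arbitrary illustration values.

```python
# Minimal sketch: a series with a deterministic time trend has a mean that
# changes with t, so it cannot be stationary. Comparing the sample mean of an
# early window with that of a late window makes this visible.
import numpy as np

rng = np.random.default_rng(0)
T = 500
e = rng.normal(0, 1, T)

trend = 0.05 * np.arange(T) + e   # y_t = 0.05*t + e_t  (nonstationary: mean grows)
level = 2.0 + e                   # y_t = 2 + e_t       (stationary around 2)

print("trend: early mean %.2f, late mean %.2f" % (trend[:100].mean(), trend[-100:].mean()))
print("level: early mean %.2f, late mean %.2f" % (level[:100].mean(), level[-100:].mean()))
```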
Theorem 11.1 (Consistency of OLS)
Under TS.1′, TS.2′, and TS.3′, the OLS estimators are consistent: plim β̂j = βj, j = 0, 1, …, k.
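A minimal simulation sketch of what consistency means here, assuming numpy and illustration values β0 = 1, β1 = 0.5: the regressor is a stable AR(1) (so stationary and weakly dependent), the error is contemporaneously exogenous, and the OLS slope settles near the true value as T grows.

```python
# Sketch of Theorem 11.1: with a stationary, weakly dependent regressor and a
# contemporaneously exogenous error, the OLS slope estimate approaches the
# true value as the sample size T grows.
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1 = 1.0, 0.5                       # true parameters (illustration values)

def simulate_and_fit(T):
    x = np.zeros(T)
    for t in range(1, T):                     # stable AR(1) regressor, rho = 0.6
        x[t] = 0.6 * x[t - 1] + rng.normal()
    u = rng.normal(size=T)                    # E(u_t | x_t) = 0 by construction
    y = beta0 + beta1 * x + u
    X = np.column_stack([np.ones(T), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS
    return b[1]

for T in (50, 500, 5000, 50000):
    print(T, round(simulate_and_fit(T), 4))
```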
Example 11.1 [Static Model]
Consider a static model with two explanatory variables: yt=β0+β1zt1+β2zt2+ut. Under weak dependence, the condition sufficient for consistency of OLS is E(ut|zt1,zt2)=0. This rules out omitted variables that are in ut and are correlated with either zt1 or zt2. Importantly, Assumption TS.3′ does not rule out correlation between, say, ut-1 and zt1. This type of correlation could arise if zt1 is related to past yt-1, such as zt1=δ0+δ1yt-1+vt. For example, zt1 might be a policy variable, such as the monthly percentage change in the money supply, and this change depends on last month's rate of inflation (yt-1). This kind of feedback is allowed under Assumption TS.3′.
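A hypothetical simulation of this feedback case, assuming numpy and arbitrary illustration values for the β's and δ's: zt1 reacts to yt-1, so ut-1 and zt1 are correlated, yet ut remains contemporaneously exogenous and OLS still recovers the coefficients in large samples.

```python
# Feedback as in Example 11.1: z_{t1} responds to last period's y, so u_{t-1}
# and z_{t1} are correlated, but E(u_t | z_{t1}, z_{t2}) = 0 still holds and
# OLS remains consistent.
import numpy as np

rng = np.random.default_rng(2)
T = 20000
beta0, beta1, beta2 = 1.0, 0.8, -0.5     # illustration values
delta0, delta1 = 0.2, 0.3

y = np.zeros(T); z1 = np.zeros(T)
z2 = rng.normal(size=T)
u = rng.normal(size=T)
v = rng.normal(size=T)
for t in range(1, T):
    z1[t] = delta0 + delta1 * y[t - 1] + v[t]            # policy reacts to y_{t-1}
    y[t] = beta0 + beta1 * z1[t] + beta2 * z2[t] + u[t]

X = np.column_stack([np.ones(T - 1), z1[1:], z2[1:]])
b = np.linalg.lstsq(X, y[1:], rcond=None)[0]             # OLS
print(np.round(b, 3))                                    # close to (1.0, 0.8, -0.5)
```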
Assumption TS.2′ (No Perfect Collinearity)
Same as Assumption TS.2.
Assumption TS.3′ (Zero Conditional Mean)
The explanatory variables xt=(xt1, xt2,…,xtk) are contemporaneously exogenous as in equation (10.10): E(ut|xt)=0.
Weakly Dependent Time Series (cont.)
Why is weak dependence important for regression analysis? Essentially, it replaces the assumption of random sampling in implying that the law of large numbers (LLN) and the central limit theorem (CLT) hold.
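A minimal sketch of the LLN point, assuming numpy: the observations of a stable AR(1) process are serially correlated rather than randomly sampled, but weak dependence still lets the sample average converge to the population mean (zero here); ρ = 0.7 is an arbitrary illustration value.

```python
# The sample average of a weakly dependent series converges to its population
# mean even without random sampling.
import numpy as np

rng = np.random.default_rng(3)

def ar1_sample_mean(T, rho=0.7):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + rng.normal()
    return y.mean()                          # population mean is 0

for T in (100, 1000, 10000, 100000):
    print(T, round(ar1_sample_mean(T), 4))
```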

Weakly Dependent Time Series
【Definition】 Loosely speaking, a stationary time series process {xt: t=1,2,…} is said to be weakly dependent if xt and xt+h are "almost independent" as h increases without bound. A similar statement holds if the sequence is nonstationary, but then we must assume that the concept of being almost independent does not depend on the starting point, t.
【Definition】 Asymptotically uncorrelated: A covariance stationary time series is weakly dependent if the correlation between xt and xt+h goes to zero "sufficiently quickly" as h → ∞.

11.2 Asymptotic Properties of OLS
Assumption TS.1′ (Linearity and Weak Dependence)
We assume the model is exactly as in Assumption TS.1, but now we add the assumption that {(xt, yt): t=1,2,…} is stationary and weakly dependent. In particular, the law of large numbers and the central limit theorem can be applied to sample averages. The model is yt = β0 + β1xt1 + … + βkxtk + ut, where the xtj can include lags of the dependent and independent variables.
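A hypothetical sketch of the kind of model TS.1′ permits, assuming numpy and arbitrary illustration coefficients: the regressors include a lag of the dependent variable itself, and OLS is applied after aligning the lagged column.

```python
# y_t = b0 + b1*y_{t-1} + b2*x_t + u_t is simulated and then estimated by OLS,
# with the lagged dependent variable included as a regressor.
import numpy as np

rng = np.random.default_rng(5)
T = 5000
b0, b1, b2 = 0.5, 0.4, 1.0               # illustration values, |b1| < 1 for stability

x = rng.normal(size=T)
u = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = b0 + b1 * y[t - 1] + b2 * x[t] + u[t]

X = np.column_stack([np.ones(T - 1), y[:-1], x[1:]])   # const, y_{t-1}, x_t
b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
print(np.round(b, 3))                                  # close to (0.5, 0.4, 1.0)
```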

The MA(1) Process
An interesting example of a weakly dependent sequence is xt = et + α1et-1, t=1,2,…, where {et: t=0,1,…} is an i.i.d. sequence with zero mean and variance σe². The process {xt} is called a moving average process of order one [MA(1)]: xt is a weighted average of et and et-1.
Why is an MA(1) process weakly dependent? For adjacent terms, xt+1 = et+1 + α1et, so Cov(xt, xt+1) = α1Var(et) = α1σe², Var(xt) = (1 + α1²)σe², and Corr(xt, xt+1) = α1/(1 + α1²).
However, if we look at variables in the sequence that are two or more time periods apart, these variables are uncorrelated because they are independent.
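A minimal sketch, assuming numpy and illustration values α1 = 0.5 and σe = 1, that checks these MA(1) moments by simulation: the lag-1 correlation is close to α1/(1 + α1²) and the correlation at lag 2 is close to zero.

```python
# Simulate x_t = e_t + a1*e_{t-1} and compare sample moments with the theory.
import numpy as np

rng = np.random.default_rng(6)
a1, sigma_e, T = 0.5, 1.0, 200000
e = rng.normal(0, sigma_e, T + 1)
x = e[1:] + a1 * e[:-1]                  # MA(1) process

print("Var(x_t):         ", round(x.var(), 3), "theory:", (1 + a1**2) * sigma_e**2)
print("Corr(x_t, x_t+1): ", round(np.corrcoef(x[:-1], x[1:])[0, 1], 3),
      "theory:", round(a1 / (1 + a1**2), 3))
print("Corr(x_t, x_t+2): ", round(np.corrcoef(x[:-2], x[2:])[0, 1], 3), "theory: 0")
```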
11.1 Stationary and Weakly Dependent Time Series
Stationary and Nonstationary Time Series
【Definition】 Stationary Stochastic Process: The stochastic process {xt: t=1,2,…} is stationary if for every collection of time indices 1 ≤ t1 < t2 < … < tm, the joint distribution of (xt1, xt2, …, xtm) is the same as the joint distribution of (xt1+h, xt2+h, …, xtm+h) for all integers h ≥ 1.
Stationary and Nonstationary Time Series (cont.)
【Definition】 Covariance Stationary Process: A stochastic process {xt: t=1,2,…} with a finite second moment [E(xt²) < ∞] is covariance stationary if (i) E(xt) is constant; (ii) Var(xt) is constant; and (iii) for any t, h ≥ 1, Cov(xt, xt+h) depends only on h and not on t. It follows immediately that the correlation between xt and xt+h also depends only on h.
Stationarity does require that the nature of any correlation between adjacent terms is the same across all time periods.
CHAPTER 11 Further Issues in Using OLS with Time Series Data
The AR(1) Process
An autoregressive process of order one [AR(1)] is yt = ρ1yt-1 + et, t=1,2,…, where the starting point in the sequence is y0 (at time t=0) and {et: t=1,2,…} is an i.i.d. sequence with zero mean and variance σe². We also assume that the et are independent of y0 and that E(y0) = 0.
The crucial assumption for weak dependence of an AR(1) process is the stability condition |ρ1| < 1; then we say that {yt} is a stable AR(1) process. For any h ≥ 1, we can derive Corr(yt, yt+h) = Cov(yt, yt+h)/(σyσy) = ρ1^h. Specifically, when h=1, Corr(yt, yt+1) = ρ1. Because |ρ1| < 1, ρ1^h → 0 as h → ∞, which means that a stable AR(1) process is weakly dependent.
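A minimal sketch, assuming numpy and an illustration value ρ1 = 0.75, comparing sample autocorrelations of a simulated stable AR(1) with the theoretical value ρ1^h.

```python
# Simulate a stable AR(1) and check Corr(y_t, y_t+h) against rho1**h.
import numpy as np

rng = np.random.default_rng(8)
rho1, T = 0.75, 300000
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho1 * y[t - 1] + rng.normal()

for h in (1, 2, 5, 10):
    sample = np.corrcoef(y[:-h], y[h:])[0, 1]
    print(h, round(sample, 3), "theory:", round(rho1**h, 3))
```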