Econometrics Lecture Slides: Chapter 11
BEE2006 - Statistics and Econometrics

Outline: Further Issues in Using OLS with Time Series Data. Wooldridge (2008), Chapter 11

Covariance Stationary Process
Weakly Dependent Time Series
Assumptions for consistency and asymptotic normality of OLS
Autoregressive distributed lag model
Random Walks
Transforming Persistent Series
Further Issues in Using OLS with Time Series Data

So far we have shown that under Assumptions TS.1 to TS.6 the Ordinary Least Squares estimator has exactly the same properties as in the cross-sectional case. However, these assumptions are extremely strong and will not be satisfied in some models (e.g. Assumption TS.3 [E(ut | X) = 0] does not hold if we have lagged dependent variables as regressors). It will be shown that, under a different set of assumptions, the inference procedures introduced in the cross-sectional case can still be used in such models in the time-series context. In some cases we would like to have lagged dependent variables as regressors:
Example: Partial adjustment model. Suppose yt* is the desired level of inventories of a firm, yt is the actual level, and xt is sales. Assume that the desired level of inventories depends on sales plus an error term,

yt* = α + βxt + vt.

Because of frictions in the market the gap between the actual and desired level cannot be closed instantaneously but only with some lag. That is, the inventory at time t equals that at time t − 1 plus an adjustment factor,

yt = yt−1 + λ(yt* − yt−1),  0 < λ < 1.

Combining these two equations (substituting yt* = α + βxt + vt into the adjustment equation) we obtain

yt = γ0 + γ1 yt−1 + γ2 xt + ut,

where γ0 = αλ, γ1 = (1 − λ), γ2 = βλ and ut = λvt.
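A minimal simulation sketch, not part of the original slides, illustrating the partial adjustment model: it generates data with assumed illustrative values α = 2, β = 0.5, λ = 0.4, fits the combined equation by OLS, and recovers λ and β from the estimated γ's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameter values (not from the slides)
alpha, beta, lam = 2.0, 0.5, 0.4
T = 5000

x = rng.normal(size=T)                # sales x_t
v = rng.normal(scale=0.5, size=T)     # error in the desired-inventory equation
y = np.zeros(T)
for t in range(1, T):
    y_star = alpha + beta * x[t] + v[t]          # desired level y_t*
    y[t] = y[t-1] + lam * (y_star - y[t-1])      # partial adjustment

# OLS on the combined equation: y_t = g0 + g1*y_{t-1} + g2*x_t + u_t
Y = y[1:]
X = np.column_stack([np.ones(T - 1), y[:-1], x[1:]])
g0, g1, g2 = np.linalg.lstsq(X, Y, rcond=None)[0]

lam_hat = 1 - g1           # since g1 = 1 - lambda
beta_hat = g2 / lam_hat    # since g2 = lambda * beta
print(lam_hat, beta_hat)   # close to 0.4 and 0.5 in large samples
```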
Main points:

If lagged dependent variables are included as regressors, OLS is biased and is not BLUE (the Gauss-Markov theorem does not hold). The justification for the use of the inference procedures relies on large-sample analysis: if lagged dependent variables are included as regressors, OLS is consistent and asymptotically normal.
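A small Monte Carlo sketch of this point, illustrative and not from the slides, for an AR(1) regression yt = ρ yt−1 + et with an assumed ρ = 0.5 and no intercept: the average OLS estimate of ρ is biased downward in short samples, but the bias shrinks as T grows, in line with the large-sample justification.

```python
import numpy as np

rng = np.random.default_rng(1)
rho = 0.5          # assumed true AR(1) coefficient (illustrative)
reps = 2000

def mean_ols_rho_hat(T):
    """Average OLS estimate of rho over `reps` simulated AR(1) samples of length T."""
    est = np.empty(reps)
    for r in range(reps):
        e = rng.normal(size=T)
        y = np.zeros(T)
        for t in range(1, T):
            y[t] = rho * y[t-1] + e[t]
        # OLS of y_t on y_{t-1} (no intercept; the true intercept is zero)
        est[r] = np.sum(y[:-1] * y[1:]) / np.sum(y[:-1] ** 2)
    return est.mean()

for T in (20, 50, 200, 1000):
    print(T, mean_ols_rho_hat(T))   # mean estimate moves toward 0.5 as T grows
```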
Stability and Dependence

Stability

We need to assume stability: if we allow the relationship between two variables (say yt and xt) to change arbitrarily each time period, we cannot hope to learn much about how a change in one variable affects the other when we only have access to a single realization. Hence we require a definition of stability (in time series this is given by the notion of stationary processes).

Dependence

In the cross-sectional case we had a random sample {X1, X2, ..., Xn}: the random variables X1 to Xn were independent with the same mean µX and finite variance σX². Under independence a Law of Large Numbers (LLN) and a Central Limit Theorem (CLT) are valid. That is,

plim X̄ = µX   (Law of Large Numbers)

√n (X̄ − µX) / σX ∼a N(0, 1)   (Central Limit Theorem)

However, in the time-series context we do not have a random sample, since the random variables X1 to Xn are not independent. Dependence is typical with time series: what happens this period and what happened last period are related, as is what happened the period before last, and so on. Thus with time-series data we typically have to deal with dependence between the observations (with cross-sectional data we typically do not). We have to introduce key concepts that address these issues.
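A quick numerical sketch, illustrative only, of the CLT statement above for i.i.d. draws: the standardized sample mean of n = 200 exponential draws behaves approximately like a standard normal across many replications.

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 200, 10_000
mu, sigma = 1.0, 1.0                # mean and std of an Exponential(1) draw

# Standardized sample means: sqrt(n) * (Xbar - mu) / sigma
x = rng.exponential(scale=1.0, size=(reps, n))
z = np.sqrt(n) * (x.mean(axis=1) - mu) / sigma

print(z.mean(), z.std())            # approximately 0 and 1
print(np.mean(np.abs(z) <= 1.96))   # approximately 0.95, as for a standard normal
```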
Covariance Stationary Process

Definition: A stochastic process {xt, t = 1, 2, ...} is covariance stationary if
- E(xt) is constant (does not vary with t),
- Var(xt) is constant,
- for any h ≥ 1, Cov(xt, xt+h) depends only on h and not on t.

Example: The process {εt, t = 1, 2, ...} such that E(εt) = 0, Var(εt) = σε² and Cov(εt, εt+h) = 0 for h ≥ 1 is known as a white noise process. It is covariance stationary.

Stationarity is important in time series because if we want to understand the relationship between two or more variables using regression analysis, we need to assume some sort of stability of the relationship over time. It also simplifies the assumptions required for an LLN and a CLT to hold.
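A short sketch, illustrative only, checking the white-noise example empirically: simulated i.i.d. noise has sample mean near 0, sample variance near σε², and sample autocovariances near 0 at every lag h ≥ 1. The helper `sample_autocov` is a hypothetical convenience function, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(3)
sigma_eps = 2.0                      # assumed white-noise standard deviation
eps = rng.normal(scale=sigma_eps, size=100_000)

def sample_autocov(x, h):
    """Sample covariance between x_t and x_{t+h} (variance when h = 0)."""
    x = x - x.mean()
    return np.mean(x[:-h] * x[h:]) if h > 0 else np.mean(x * x)

print(eps.mean())                    # approximately 0
print(sample_autocov(eps, 0))        # approximately sigma_eps**2 = 4
print([round(sample_autocov(eps, h), 3) for h in (1, 2, 5, 10)])  # all approximately 0
```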
Weakly Dependent Time Series

We need to replace the concept of independence by a different concept in time series. A stationary time series is weakly dependent if xt and xt+h are "almost uncorrelated" as h increases. If for a covariance stationary process Corr(xt, xt+h) → 0 as h → ∞, we say that the process is weakly dependent. We need weak dependence for LLNs and CLTs to hold. We will not give a rigorous technical definition of a weakly dependent process.
MA(1) Process

A stochastic process is a moving average process of order one, MA(1), if

xt = et + α1 et−1,  t = 1, 2, ...,

with et a white noise process with variance σe². This is a stationary, weakly dependent sequence: variables one period apart are correlated, but variables two or more periods apart are not. Notice that Cov(xt, xt+1) = α1 σe², so Corr(xt, xt+1) = α1 / (1 + α1²), while Cov(xt, xt+h) = 0 for h ≥ 2, so the correlation dies out and the weak dependence requirement is satisfied.
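A minimal sketch, with an assumed α1 = 0.6 for illustration, comparing the theoretical MA(1) autocorrelations with sample autocorrelations from a long simulated path: the lag-1 value is close to α1/(1 + α1²) and lags 2 and beyond are close to zero.

```python
import numpy as np

rng = np.random.default_rng(4)
alpha1 = 0.6                          # assumed MA(1) coefficient (illustrative)
T = 200_000

e = rng.normal(size=T + 1)            # white noise e_0, ..., e_T
x = e[1:] + alpha1 * e[:-1]           # x_t = e_t + alpha1 * e_{t-1}

def sample_autocorr(x, h):
    """Sample correlation between x_t and x_{t+h}."""
    x = x - x.mean()
    return np.mean(x[:-h] * x[h:]) / np.mean(x * x)

print(alpha1 / (1 + alpha1**2))       # theoretical lag-1 autocorrelation, about 0.441
print([round(sample_autocorr(x, h), 3) for h in (1, 2, 3, 5)])
# lag 1 close to 0.441; higher lags close to 0
```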