So the first order AR process has been recast as an infinite order MA one.
The Correlogram and partial autocorrelation function
• Two important tools for diagnosing the time series properties of a series
• These are a broadly sensible approach to forecasting, but they are not the result of a particular economic or statistical view about the way the data were generated.
Wold's Decomposition
For any series x_t which is a covariance stationary stochastic process with E(x_t) = 0, the process generating x may be written as,
If we define f_t to be the forecast of X_t using only past information, then the Holt procedure forecasts X_{t+1} as

f_{t+1} = m_t + g_t
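As a sketch, the Holt procedure can be coded directly. The update recursions for m_t and g_t below are the standard Holt smoothing equations; the function name and the smoothing weights alpha and beta are illustrative assumptions, not taken from the lecture.

```python
# A sketch of Holt's procedure (illustrative names; alpha and beta are
# assumed smoothing weights between 0 and 1).
def holt_forecast(x, alpha=0.5, beta=0.5):
    m = x[0]               # m_t: estimate of the underlying level
    g = x[1] - x[0]        # g_t: estimate of the growth rate
    for obs in x[2:]:
        m_prev = m
        m = alpha * obs + (1 - alpha) * (m + g)    # update the level
        g = beta * (m - m_prev) + (1 - beta) * g   # update the growth
    return m + g           # f_{t+1} = m_t + g_t

series = [2.0, 4.0, 6.0, 8.0, 10.0]
forecast = holt_forecast(series)   # extrapolates level plus growth
```

On a trending series the forecast extrapolates the smoothed level plus the smoothed growth one step ahead; on a constant series the growth estimate settles to zero.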
The correlogram shows the correlation between a variable X_t and a number of its past values:

C_k = [ (1/T) Σ_{t=1}^{T-k} (X_{t+k} - X*)(X_t - X*) ] / [ (1/T) Σ_{t=1}^{T} (X_t - X*)² ]

where X* is the sample mean of X.
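The formula translates directly into code; this minimal sketch (the function name is my own) computes C_k for a list of observations.

```python
# Sample autocorrelations C_k, computed exactly as in the formula above.
def correlogram(x, max_lag):
    T = len(x)
    mean = sum(x) / T                          # X*: the sample mean
    denom = sum((v - mean) ** 2 for v in x) / T
    acf = []
    for k in range(1, max_lag + 1):
        num = sum((x[t + k] - mean) * (x[t] - mean) for t in range(T - k)) / T
        acf.append(num / denom)
    return acf

# A steadily rising series is highly autocorrelated at short lags.
acf = correlogram([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0], max_lag=2)
```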
X̂_k = Σ_{j=1}^{k-1} w_j X_{k-j}

or

X̂_k = Σ_{j=1}^{k-1} w*_j X_{k-j} + Σ_{j=1}^{T-k} w*_j X_{k+j}

where the w_j sum to unity, w_j = (1 - λ) λ^{j-1} for 0 < λ < 1.
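A quick numerical check of the weighting scheme, assuming the geometric form w_j = (1 − λ)λ^(j−1) given above; the value of λ and the helper `ewma_estimate` are illustrative, not from the lecture.

```python
# Illustrative check of the EWMA weighting scheme, assuming
# w_j = (1 - lam) * lam**(j - 1) for 0 < lam < 1.
lam = 0.7
weights = [(1 - lam) * lam ** (j - 1) for j in range(1, 200)]
# The geometric weights sum to (essentially) unity.

def ewma_estimate(x, lam):
    # Weight the most recent observation most heavily; the truncated
    # weights are renormalised so they sum to one over the sample.
    w = [(1 - lam) * lam ** j for j in range(len(x))]
    return sum(wj * xj for wj, xj in zip(w, reversed(x))) / sum(w)

est = ewma_estimate([1.0, 2.0, 3.0, 4.0], lam=0.5)  # pulled towards 4
```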
The basic EWMA model was adapted by Holt (1957) and Winters (1960) to allow it to capture a variable trend term.
As a general rule, a low-order AR process will give rise to a high-order MA representation, and a low-order MA process will give rise to a high-order AR representation.
It is important to remember that, at least in principle, not all series are integrated.
X_t = 1.5 X_{t-1}
If we transform this,
ΔX_t = X_t - X_{t-1} = 0.5 X_{t-1}
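A short simulation illustrates why differencing does not help here: each first difference still contains 0.5 X_{t-1}, which itself grows without bound.

```python
# The explosive process X_t = 1.5 X_{t-1}: its first difference
# dX_t = 0.5 X_{t-1} still carries the (exploding) level of X.
x = [1.0]
for _ in range(20):
    x.append(1.5 * x[-1])

dx = [x[t] - x[t - 1] for t in range(1, len(x))]
# dx grows geometrically too, so differencing cannot induce stationarity.
```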
x_t = Σ_{j=0}^{∞} ψ_j ε_{t-j} + d_t

where ψ_0 = 1, Σ_{j=0}^{∞} ψ_j² < ∞, E(ε_t) = 0, E(ε_t²) = σ², and E(ε_t ε_s) = 0 for t ≠ s.

d_t is termed the linearly deterministic part of x, while Σ_{j=0}^{∞} ψ_j ε_{t-j} is termed the linearly indeterministic part.
θ(L) ε_t = ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}
This would be referred to as a qth order moving average process, or MA(q).
ARMA models
• A mixture of these two types of model is referred to as an autoregressive moving average model, ARMA(n, q), where n is the order of the autoregressive part and q is the order of the moving average part.
Integration
An integrated series is one which may be rendered stationary by differencing, so if
Y_t = ΔX_t = X_t - X_{t-1}
and Yt is stationary then X is an integrated process.
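A minimal numerical illustration: cumulating a set of stationary increments produces an integrated series, and first differencing recovers the increments exactly. The increment values are arbitrary stand-ins.

```python
# Cumulating stationary increments yields an integrated (I(1)) series;
# first differencing recovers the increments.
increments = [0.3, -0.1, 0.4, 0.2, -0.5]   # a stand-in for a stationary Y_t
x = [0.0]
for e in increments:
    x.append(x[-1] + e)                    # X_t = X_{t-1} + Y_t

y = [x[t] - x[t - 1] for t in range(1, len(x))]   # Y_t = X_t - X_{t-1}
```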
Stationarity
We are primarily concerned with weak, or covariance, stationarity: such a series has a constant mean, a constant finite variance, and autocovariances that depend only on the distance between observations.
The simplest form of stochastic trend is given by the following random walk with drift model.
The Exponentially Weighted Moving Average model (EWMA)
If we have a sample Xt, t=1...T and we wish to form an estimate of X at time k then we can do this in one of two ways,
The basic moving average model represents X as a function of current and lagged values of a white noise process:

X_t = θ(L) ε_t, where ε_t is a white noise error process
X* = (1/T) Σ_{t=1}^{T} X_t
The partial autocorrelation function is given as the coefficients from a simple autoregression of the form,

X_t = A_0 + Σ_{i=1}^{n} P_i X_{t-i} + u_t

where the P_i are the estimates of the partial autocorrelation function.
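A sketch of this construction using ordinary least squares (numpy's `lstsq`); the function name and the noiseless AR(2) test series are illustrative assumptions.

```python
import numpy as np

# The lag-n partial autocorrelation taken as the last coefficient P_n
# from the regression X_t = A_0 + sum_{i=1}^n P_i X_{t-i} + u_t.
def pacf(x, max_lag):
    x = np.asarray(x)
    out = []
    for n in range(1, max_lag + 1):
        y = x[n:]
        cols = [np.ones_like(y)] + [x[n - i:-i] for i in range(1, n + 1)]
        coef, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
        out.append(coef[-1])   # coefficient on the longest lag, X_{t-n}
    return out

# On a noiseless AR(2) series the lag-2 regression recovers the
# coefficient on X_{t-2}.
x = [1.0, 2.0]
for _ in range(20):
    x.append(0.5 * x[-1] + 0.3 * x[-2])
p = pacf(x, max_lag=2)
```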
Then we are still left with the level of X on the right-hand side of the equation; further differencing will not remove this level effect.
`Ad Hoc' forecasting procedures
where g is the expected rate of increase of the series and m is our best estimate of the underlying value of the series.
x_t = ρ x_{t-1} + ε_t,   ρ < 1

By successively lagging this equation and substituting out the lagged value of x we may rewrite this as,

x_t = Σ_{j=0}^{∞} ρ^j ε_{t-j},   assuming ρ^j x_{t-j} → 0 as j → ∞
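The substitution can be verified numerically with a fixed set of shocks: iterating the AR(1) recursion from x_0 = 0 reproduces the (truncated) MA sum exactly. The shock values and ρ are arbitrary illustrations.

```python
# Numerical check of the substitution: with x_0 = 0 and fixed shocks,
# the AR(1) recursion equals the truncated MA representation.
rho = 0.8
eps = [0.5, -0.2, 0.1, 0.4, -0.3, 0.2]

x = 0.0
for e in eps:
    x = rho * x + e                       # x_t = rho * x_{t-1} + e_t

# MA form: sum_j rho**j * e_{t-j} over the available shocks.
ma_form = sum(rho ** j * e for j, e in enumerate(reversed(eps)))
```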
Lecture 2
Stephen G Hall
Time Series Forecasting
Introduction
• These are a body of techniques which rely primarily on the statistical properties of the data, either in isolated single series or in groups of series, and do not exploit our understanding of the working of the economy at all.
• See `Applied Economic Forecasting Techniques', ed. S G Hall, Simon and Schuster, 1994.
Some basic concepts
• Two basic types of time series models exist,
X_t = α + X_{t-1} + ε_t

where, if X_0 = 0, we can express this as

X_t = α t + Σ_{i=1}^{t} ε_i
Now this equation has a stochastic trend, given by the term in the summation of errors, and a deterministic trend given by the term involving t.
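With X_0 = 0 the decomposition can be checked directly: the recursion and the "deterministic trend plus cumulated shocks" form give the same level. The drift and shock values below are arbitrary.

```python
# Random walk with drift: with X_0 = 0 the level splits into a
# deterministic trend a*t plus the cumulated shocks.
a = 0.5
eps = [0.3, -0.1, 0.2, -0.4, 0.1]

x = 0.0
for e in eps:
    x = a + x + e                          # X_t = a + X_{t-1} + e_t

decomposed = a * len(eps) + sum(eps)       # a*t + sum_{i=1}^t e_i
```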
X_t = α(L) X_{t-1} + ε_t, where ε_t is a white noise error process and α(L) X_{t-1} = α_1 X_{t-1} + α_2 X_{t-2} + ... + α_n X_{t-n}
This would be referred to as an nth order autoregressive process, or AR(n).
• these are autoregressive and moving average models.
What information do we have to forecast a series?
[figure: the series plotted against a time axis]
The basic autoregressive model for a series X is,
• The objective is not to build models which are a good representation of the economy with all its complex interconnections, but rather to build simple models which capture the time series behaviour of the data and may be used to provide an adequate basis for forecasting alone.
The effect of a shock (or error) will never disappear.
If, however,

X_t = α + ρ X_{t-1} + ε_t,   ρ < 1

then

X_t = c + Σ_{i=1}^{t} ρ^i ε_{t-i}
Then the moving average error term would no longer cumulate and the process would be stationary.
Further, if, as above, X only requires differencing once to produce a stationary series, it is defined to be integrated of order 1, often denoted I(1). A series might be I(2), which means it must be differenced twice before it becomes stationary, and so on.
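As a deterministic analogue of an I(2) series, a quadratic trend needs two rounds of differencing before it is reduced to a constant:

```python
# Deterministic analogue of an I(2) series: a quadratic trend requires
# two differences to become constant.
x = [t ** 2 for t in range(10)]                      # 0, 1, 4, 9, ...
d1 = [x[t] - x[t - 1] for t in range(1, len(x))]     # linear: 1, 3, 5, ...
d2 = [d1[t] - d1[t - 1] for t in range(1, len(d1))]  # constant: 2, 2, ...
```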