Modeling Cycles By ARMA
• Specification
• Identification (Pre-fit)
• Testing (Post-fit)
• Forecasting
Definitions
• Data = Trend + Season + Cycle + Irregular
• Cycle + Irregular = Data – Trend (curves) – Season (dummy variables)
• For this presentation, let:
Yt = Cyclet + Irregulart
Stationary Process For Cycles
Cycle + Irregular ≈ Stationary Process ≈ ARMA(p, q)
(≈ : approximation)
Stationary Process
• Series Yt is stationary if:
– E(Yt) = μ, constant for all t
– Var(Yt) = σ², constant for all t
– Cov(Yt, Y(t+h)) = γh, does not depend on t
• WN is a special example of a stationary process
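The three conditions above can be illustrated by simulation. A minimal NumPy sketch (simulated white noise, illustrative sample sizes): for a stationary series, the mean, variance, and lag-1 autocovariance computed over different time windows should agree up to sampling error.

```python
import numpy as np

rng = np.random.default_rng(0)
eps = rng.normal(0.0, 1.0, size=100_000)  # white noise, mu = 0, sigma = 1

# Split the series in half: for a stationary process the sample mean and
# variance of each half should agree (up to sampling error).
first, second = eps[:50_000], eps[50_000:]
mean_gap = abs(first.mean() - second.mean())
var_gap = abs(first.var() - second.var())

def lag1_cov(x):
    """Lag-1 sample autocovariance: estimates Cov(Yt, Y(t+1))."""
    return np.mean((x[:-1] - x.mean()) * (x[1:] - x.mean()))

# Cov(Yt, Y(t+h)) must not depend on t, so the two windows should agree here too.
cov_gap = abs(lag1_cov(first) - lag1_cov(second))
```

All three gaps shrink toward zero as the window length grows, which is exactly what "constant for all t" means in practice.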
Models For a Stationary Process
• Autoregressive Process, AR(p)
• Moving Average Process, MA(q)
• Autoregressive Moving Average Process, ARMA(p, q)
Parameters of ARMA Models
Specification Parameters
φk : Autoregressive Process Parameter
θk : Moving Average Process Parameter
Characterization Parameters
ρk : Autocorrelation Coefficient
φkk : Partial Autocorrelation Coefficient
AR Process
• AR(1): (Yt − μ) = φ1 (Y(t-1) − μ) + εt
−1 < φ1 < 1
(stationarity condition)
• AR(2): (Yt − μ) = φ1 (Y(t-1) − μ) + φ2 (Y(t-2) − μ) + εt
φ2 + φ1 < 1, φ2 − φ1 < 1, −1 < φ2 < 1
(stationarity condition)
εt is WN(σ²)
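A quick simulation check, as a NumPy sketch with an illustrative parameter value (φ1 = 0.7, chosen only to satisfy the stationarity condition): generate an AR(1) series by recursion and compare its sample moments with the theoretical stationary values Var(Y) = σ²/(1 − φ1²) and Corr(Yt, Y(t-1)) = φ1.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ar1(phi1, n, burn=500):
    """Generate Yt = phi1 * Y(t-1) + eps_t (mu = 0), discarding a burn-in."""
    eps = rng.normal(size=n + burn)
    y = np.zeros(n + burn)
    for t in range(1, n + burn):
        y[t] = phi1 * y[t - 1] + eps[t]
    return y[burn:]

phi1 = 0.7                       # satisfies -1 < phi1 < 1 (stationary)
y = simulate_ar1(phi1, 20_000)

# Theoretical stationary moments for AR(1) with sigma = 1:
theoretical_var = 1.0 / (1.0 - phi1 ** 2)
sample_var = y.var()
sample_rho1 = np.corrcoef(y[1:], y[:-1])[0, 1]
```

With φ1 outside (−1, 1) the recursion explodes instead of settling to these constants, which is what the stationarity condition rules out.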
MA Process
• MA(1): Yt − μ = εt + θ1 ε(t-1)
−1 < θ1 < 1
(invertibility condition)
• MA(2): Yt − μ = εt + θ1 ε(t-1) + θ2 ε(t-2)
θ2 + θ1 > −1, θ2 − θ1 > −1, −1 < θ2 < 1
(invertibility condition)
εt is WN(σ²)
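The defining feature of an MA(q) process, its autocorrelations vanishing beyond lag q, is easy to verify numerically. A NumPy sketch with an illustrative θ1 = 0.5 (theory: ρ1 = θ1/(1 + θ1²) = 0.4 and ρ2 = 0 for MA(1)):

```python
import numpy as np

rng = np.random.default_rng(2)
theta1 = 0.5                       # satisfies -1 < theta1 < 1 (invertible)
n = 50_000
eps = rng.normal(size=n + 1)
y = eps[1:] + theta1 * eps[:-1]    # Yt = eps_t + theta1 * eps_(t-1), mu = 0

def sample_acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return np.sum(x[:-k] * x[k:]) / np.sum(x * x)

rho1 = sample_acf(y, 1)   # theory: theta1 / (1 + theta1**2) = 0.4
rho2 = sample_acf(y, 2)   # theory: 0 -- the ACF cuts off after lag q = 1
```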
ARMA (p, q) Models
• ARMA(1, 1):
(Yt − μ) = φ1 (Y(t-1) − μ) + εt + θ1 ε(t-1)
• ARMA(2, 1):
(Yt − μ) = φ1 (Y(t-1) − μ) + φ2 (Y(t-2) − μ) + εt + θ1 ε(t-1)
• ARMA(1, 2):
(Yt − μ) = φ1 (Y(t-1) − μ) + εt + θ1 ε(t-1) + θ2 ε(t-2)
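The ARMA(1, 1) equation can likewise be simulated directly from its recursion. A NumPy sketch with illustrative parameters φ1 = 0.6, θ1 = 0.3, compared against the standard closed-form lag-1 autocorrelation ρ1 = (1 + φ1θ1)(φ1 + θ1) / (1 + θ1² + 2φ1θ1):

```python
import numpy as np

rng = np.random.default_rng(3)
phi1, theta1 = 0.6, 0.3
n, burn = 50_000, 500

eps = rng.normal(size=n + burn)
y = np.zeros(n + burn)
for t in range(1, n + burn):
    # (Yt - mu) = phi1 (Y(t-1) - mu) + eps_t + theta1 eps_(t-1), with mu = 0
    y[t] = phi1 * y[t - 1] + eps[t] + theta1 * eps[t - 1]
y = y[burn:]

rho1_theory = ((1 + phi1 * theta1) * (phi1 + theta1)
               / (1 + theta1 ** 2 + 2 * phi1 * theta1))
x = y - y.mean()
rho1_sample = np.sum(x[:-1] * x[1:]) / np.sum(x * x)
```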
Wold Theorem
• Any “stationary process” can be defined as a linear combination of a WN series, εt.
This means:
Yt = εt + b1 ε(t-1) + b2 ε(t-2) + …
with: Σ bj² < ∞
Lag Operator, L
• Lag Operator, L:
L Yt = Y(t-1)
L² Yt = Y(t-2)
• Then, the Wold Theorem can be written as:
Yt = (1 + b1 L + b2 L² + …) εt = B(L) εt
Approximation
• Approximation of B(L) by a Simple Rational Polynomial of L:
Yt = B(L) εt ≈ [Θ(L) / Φ(L)] εt, equivalently Φ(L) Yt = Θ(L) εt
where:
Θ(L) = 1 + θ1 L + θ2 L² + … + θq L^q
Φ(L) = 1 − φ1 L − φ2 L² − … − φp L^p
Generating AR(1)
• Let: Θ(L) = 1, Φ(L) = 1 − φ1 L
Then:
Yt = [1 / (1 − φ1 L)] εt
(1 − φ1 L) Yt = εt
Yt − φ1 L Yt = εt
Yt − φ1 Y(t-1) = εt
Yt = φ1 Y(t-1) + εt
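The two representations of the AR(1), the recursion (1 − φ1 L) Yt = εt and the geometric expansion Yt = Σ φ1^j ε(t-j), can be checked against each other numerically. A NumPy sketch (illustrative φ1 = 0.7; the infinite sum is truncated at 60 terms, which is negligible since φ1^60 ≈ 5e-10):

```python
import numpy as np

rng = np.random.default_rng(4)
phi1, n = 0.7, 5_000
eps = rng.normal(size=n)

# Recursion form: Yt = phi1 * Y(t-1) + eps_t
y_rec = np.zeros(n)
y_rec[0] = eps[0]
for t in range(1, n):
    y_rec[t] = phi1 * y_rec[t - 1] + eps[t]

# Expansion form: 1/(1 - phi1*L) = 1 + phi1*L + phi1^2*L^2 + ...,
# so Yt = sum_j phi1^j * eps_(t-j); implemented as a truncated convolution.
J = 60
weights = phi1 ** np.arange(J)
y_wold = np.convolve(eps, weights)[:n]

max_gap = np.max(np.abs(y_rec - y_wold))   # near machine precision
```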
Generating MA(1)
• Let: Φ(L) = 1, Θ(L) = 1 + θ1 L
Then:
Yt = (1 + θ1 L) εt
Yt = εt + θ1 ε(t-1)
Generating ARMA(1,1)
• Your Exercise
Pre-Fitting Model Identification: AR, MA or ARMA?
• Using ACF and PACF
Partial Autocorrelation Function: PACF
• Notation:
– The partial autocorrelation of order k is denoted φkk
• Interpretation:
– φkk = Correlation(Yt, Y(t-k) | Y(t-1), …, Y(t-k+1)):
the correlation between Yt and Y(t-k) after controlling for the intermediate values Y(t-1), Y(t-2), …, Y(t-k+1)
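This "controlling for intermediate values" has a concrete computational meaning: φkk is the coefficient on Y(t-k) in a regression of Yt on Y(t-1), …, Y(t-k). A NumPy sketch on a simulated AR(1) with illustrative φ1 = 0.7, where theory says φ11 = φ1 and φ22 = 0:

```python
import numpy as np

rng = np.random.default_rng(5)
phi1, n = 0.7, 20_000
eps = rng.normal(size=n)
y = np.zeros(n)
y[0] = eps[0]
for t in range(1, n):
    y[t] = phi1 * y[t - 1] + eps[t]   # AR(1), so phi_kk = 0 for k > 1

def pacf_at(y, k):
    """phi_kk: last coefficient when regressing Yt on Y(t-1), ..., Y(t-k)."""
    T = len(y)
    # column j holds Y(t-j-1) for t = k, ..., T-1 (mean-zero series, no intercept)
    X = np.column_stack([y[k - j - 1:T - j - 1] for j in range(k)])
    coef, *_ = np.linalg.lstsq(X, y[k:], rcond=None)
    return coef[-1]

phi11 = pacf_at(y, 1)   # close to phi1 = 0.7
phi22 = pacf_at(y, 2)   # close to 0: the PACF of an AR(1) cuts off after lag 1
```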
Patterns of ACF and PACF
• AR(p) processes: ACF tails off gradually; PACF cuts off after lag p
• MA(q) processes: ACF cuts off after lag q; PACF tails off gradually
• ARMA(p, q) processes: both ACF and PACF tail off gradually
Model Diagnostics – Post-Fit
• Residual Check:
– Correlogram of the residuals
– Q_LB (Ljung–Box) statistic, with df = m − (# of parameters)
• SE of the regression
• Tests of significance of coefficients
• AIC, SIC
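The Ljung–Box statistic is simple enough to compute by hand. A NumPy sketch on simulated white noise standing in for model residuals (the formula is Q_LB = T(T+2) Σ_{k=1..m} ρk²/(T−k); under the white-noise null Q_LB is approximately chi-square with m − #parameters degrees of freedom, and the 5% critical value of χ²(10) is about 18.31):

```python
import numpy as np

rng = np.random.default_rng(6)
resid = rng.normal(size=2_000)   # stand-in residuals; should behave like WN

def ljung_box_q(e, m):
    """Q_LB = T(T+2) * sum_{k=1..m} rho_k^2 / (T - k)."""
    T = len(e)
    e = e - e.mean()
    denom = np.sum(e * e)
    rho = np.array([np.sum(e[:-k] * e[k:]) / denom for k in range(1, m + 1)])
    return T * (T + 2) * np.sum(rho ** 2 / (T - np.arange(1, m + 1)))

m = 10
q = ljung_box_q(resid, m)
# For true white noise, q typically falls well below the 5% critical value.
```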
AIC and SIC
L = log likelihood (maximized in estimation)
AIC = −2L/T + 2K/T (minimized across candidate models)
SIC = −2L/T + K·log(T)/T (minimized across candidate models)
where K = number of estimated parameters and T = sample size
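Both criteria are one-line computations. A sketch with purely illustrative numbers (T, K, and L are not from any real fit); note that SIC penalizes each extra parameter by log(T)/T versus AIC's 2/T, so SIC penalizes more heavily whenever log(T) > 2:

```python
import numpy as np

# Hypothetical fit: T observations, K estimated parameters, log likelihood L.
T, K, L = 200, 3, -310.5   # illustrative values only

aic = -2 * L / T + 2 * K / T           # = 3.135 here
sic = -2 * L / T + K * np.log(T) / T   # larger: log(200) > 2
```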
Truth is Simple
• Parsimony – use a minimum number of unknown parameters
Importance of Parsimony
A. In-Sample RMSE (SE) of Model Prediction
vs.
B. Out-of-Sample RMSE
The two should not differ much.
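This comparison can be sketched numerically. In the NumPy example below (simulated AR(1) with illustrative φ1 = 0.7), an AR(1) is fitted by least squares on the first half of the sample only, and the in-sample RMSE is compared with the RMSE on the held-out second half; for a parsimonious, correctly specified model the two are close, both near σ = 1:

```python
import numpy as np

rng = np.random.default_rng(7)
phi1, n = 0.7, 4_000
eps = rng.normal(size=n)
y = np.zeros(n)
y[0] = eps[0]
for t in range(1, n):
    y[t] = phi1 * y[t - 1] + eps[t]

# Fit AR(1) by least squares on the first half only (mean-zero, no intercept).
half = n // 2
train_x, train_y = y[:half - 1], y[1:half]
phi_hat = np.dot(train_x, train_y) / np.dot(train_x, train_x)

# A. In-sample RMSE
rmse_in = np.sqrt(np.mean((train_y - phi_hat * train_x) ** 2))
# B. Out-of-sample RMSE on the unseen second half
test_x, test_y = y[half:-1], y[half + 1:]
rmse_out = np.sqrt(np.mean((test_y - phi_hat * test_x) ** 2))
```

An overfitted model (many superfluous lags) would instead show rmse_out noticeably larger than rmse_in.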
EViews Commands
• AR: ls series_name c ar(1) ar(2) …
• MA: ls series_name c ma(1) ma(2) …
• ARMA: ls series_name c ar(1) ar(2) … ma(1) ma(2) …
Forecasting Rules
• Sample range: 1 to T. Forecast T+h for h = 1, 2, …
• Write the model, with all unknown parameters replaced by their estimates.
• Write down the information set ΩT (only the necessary part).
• Set the unknown future errors to 0.
• Use the chain rule: each forecast feeds into the next.
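The rules above can be sketched for an AR(1). The estimates and the last observation below are purely illustrative (not from a real fit); note how the chain rule substitutes each forecast into the next step, so the forecasts decay geometrically toward the mean:

```python
# Estimated AR(1): (Yt - mu) = phi1 (Y(t-1) - mu) + eps_t,
# with hypothetical estimates:
mu_hat, phi1_hat = 10.0, 0.6
y_T = 12.0   # last observed value: the only part of Omega_T that is needed

# Unknown future errors are set to 0; chain rule links the horizons.
f1 = mu_hat + phi1_hat * (y_T - mu_hat)   # T+1: uses the observed y_T
f2 = mu_hat + phi1_hat * (f1 - mu_hat)    # T+2: uses the forecast f1
f3 = mu_hat + phi1_hat * (f2 - mu_hat)    # T+3: decaying toward mu_hat
```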
Interval Forecast
• h = 1:
– Use the SE of the regression for setting the upper and the lower limits
• h = 2 (the forecast-error SE grows with the horizon):
– a) AR(1): √(1 + φ1²) · SE
– b) MA(1): √(1 + θ1²) · SE
– c) ARMA(1, 1): √(1 + (φ1 + θ1)²) · SE
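The AR(1) factor √(1 + φ1²) can be verified by simulating the two-step forecast error directly: e(T+2) = ε(T+2) + φ1 ε(T+1), so its standard deviation is σ√(1 + φ1²). A NumPy sketch with an illustrative φ1 = 0.6:

```python
import numpy as np

rng = np.random.default_rng(8)
phi1, sigma, n_paths = 0.6, 1.0, 50_000

# h = 2 forecast error of an AR(1): e = eps(T+2) + phi1 * eps(T+1)
eps1 = rng.normal(0, sigma, n_paths)
eps2 = rng.normal(0, sigma, n_paths)
err_h2 = eps2 + phi1 * eps1

sd_theory = sigma * np.sqrt(1 + phi1 ** 2)   # the interval half-width scale
sd_sim = err_h2.std()                        # matches sd_theory closely
```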