
Lecture 5 ARMA Models
Lochstoer
UCLA Anderson School of Management
Winter 2022


Lochstoer, UCLA Anderson School of Management. Lecture 5 ARMA Models, Winter 2022. 1 / 91

1 Autoregressive Models
2 Application: Bond Pricing
3 Moving Average Models
4 ARMA Models
5 References
6 Appendix

Autoregressive Models

ARMA Models
parsimonious description of (univariate) time series (mimicking autocorrelation etc.)
very useful tools for forecasting (and commonly used in industry)
forecasting sales, earnings, or revenue growth at the firm level or at the industry level
forecasting GDP growth and inflation at the national level

Autoregressive process of order 1
Lagged returns might be useful in predicting returns. We consider a model that allows for this:
r_{t+1} = φ_0 + φ_1 r_t + ε_{t+1},  ε_{t+1} ~ WN(0, σ²)
{ε_t} represents the "news":
ε_t = r_t − E_{t−1}[r_t]
ε_t is what you know about the process at t but not at t−1
Economists often call ε_t the "shocks" or "innovations". This model is referred to as an AR(1).
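To make the innovation idea concrete, here is a minimal simulation sketch in Python (the parameter values φ_0 = 0.02, φ_1 = 0.6, σ = 0.1 are made up for illustration, not from the lecture), confirming that ε_{t+1} is exactly the one-step-ahead surprise r_{t+1} − E_t[r_{t+1}]:

```python
import numpy as np

# Simulate r_{t+1} = phi0 + phi1 * r_t + eps_{t+1}, eps ~ N(0, sigma^2)
# (illustrative parameter values, not from the lecture)
rng = np.random.default_rng(0)
phi0, phi1, sigma = 0.02, 0.6, 0.1
T = 10_000

eps = rng.normal(0.0, sigma, T)   # the "shocks" / "innovations"
r = np.empty(T)
r[0] = phi0 / (1 - phi1)          # start at the unconditional mean
for t in range(T - 1):
    r[t + 1] = phi0 + phi1 * r[t] + eps[t + 1]

# eps_{t+1} = r_{t+1} - E_t[r_{t+1}]: what you learn at t+1
# that you did not know at t
surprise = r[1:] - (phi0 + phi1 * r[:-1])
assert np.allclose(surprise, eps[1:])
```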

Transition density
Definition
Given an information set F_t, the transition density of a random variable r_{t+1} is the conditional distribution of r_{t+1} given by:
r_{t+1} ~ p(r_{t+1} | F_t; θ)
The information set F_t is often (but not always) the history of the process
r_t, r_{t−1}, r_{t−2}, …
In this case, the transition density is written:
r_{t+1} ~ p(r_{t+1} | r_t, r_{t−1}, …; θ)
A transition density is Markov if it depends only on its finite past.

AR(1) transition density
Consider the AR(1) model with Gaussian shocks:
r_{t+1} = φ_0 + φ_1 r_t + ε_{t+1},  ε_t ~ N(0, σ²)
The transition density is Markov of order 1:
r_{t+1} ~ p(r_{t+1} | r_t; θ)
the rest of the history r_{t−1}, r_{t−2}, … is irrelevant. With Gaussian shocks ε_t, the transition density is:
r_{t+1} ~ N(φ_0 + φ_1 r_t, σ²)
conditional mean and conditional variance:
E[r_{t+1} | r_t] = φ_0 + φ_1 r_t,  V[r_{t+1} | r_t] = V[ε_{t+1}] = σ².

Unconditional mean of AR(1)
assume that the series is covariance stationary
compute the unconditional mean μ. Take unconditional expectations:
E[r_{t+1}] = φ_0 + φ_1 E[r_t]. Use stationarity: E[r_{t+1}] = E[r_t] = μ:
μ = φ_0 + φ_1 μ, and solving for the unconditional mean:
μ = φ_0 / (1 − φ_1).
the mean exists if φ_1 ≠ 1, and is zero if φ_0 = 0
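As a quick numeric sanity check (with illustrative values φ_0 = 0.02 and φ_1 = 0.6, not from the lecture), the formula is just the fixed point of μ = φ_0 + φ_1 μ:

```python
# mu = phi0 / (1 - phi1) solves the fixed-point equation mu = phi0 + phi1*mu
phi0, phi1 = 0.02, 0.6
mu = phi0 / (1 - phi1)
assert abs(mu - (phi0 + phi1 * mu)) < 1e-12
```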

Mean Reversion
if φ_1 ≠ 1, we can rewrite the AR(1) process as:
r_{t+1} − μ = φ_1 (r_t − μ) + ε_{t+1}.
suppose 0 < φ_1 < 1. When r_t > μ, the process is expected to get closer to the mean:
E_t[r_{t+1}] − μ = φ_1 (r_t − μ) < (r_t − μ). When r_t < μ, the process is expected to get closer to the mean: E_t[r_{t+1}] − μ = φ_1 (r_t − μ) > (r_t − μ).
the smaller φ_1, the higher the speed of mean reversion

Mean Reversion
iterating two periods ahead, we can rewrite the AR(1) process as:
r_{t+2} − μ = φ_1² (r_t − μ) + φ_1 ε_{t+1} + ε_{t+2}.
suppose 0 < φ_1 < 1. When r_t > μ, the process is expected to get closer to the mean:
E_t[r_{t+2}] − μ = φ_1² (r_t − μ) < (r_t − μ). When r_t < μ, the process is expected to get closer to the mean: E_t[r_{t+2}] − μ = φ_1² (r_t − μ) > (r_t − μ).

Mean Reversion
we can rewrite the AR(1) process as:
r_{t+h} − μ = φ_1^h (r_t − μ) + φ_1^{h−1} ε_{t+1} + … + ε_{t+h}.
suppose 0 < φ_1 < 1. At the half-life h, the process is expected to cover 1/2 of the distance to the mean:
E_t[r_{t+h}] − μ = φ_1^h (r_t − μ) = 0.5 (r_t − μ).
the half-life is defined by setting φ_1^h = 0.5 and solving:
h = log(0.5) / log(φ_1)

Variance of AR(1)
compute the unconditional variance: take the expectation of the square of:
r_{t+1} − μ = φ_1 (r_t − μ) + ε_{t+1}.
we obtain the following expression for the unconditional variance:
V[r_{t+1}] = σ² / (1 − φ_1²),
provided that φ_1² < 1, because the variance has to be positive and bounded, as covariance stationarity requires
in addition, if −1 < φ_1 < 1, we can show that the series is covariance stationary because the mean and variance are finite

Continuous-Time Model
Definition
In a continuous-time model, the log of stock prices, p_t = log P_t, follows an Ornstein-Uhlenbeck process if:
dp_t = κ (p̄ − p_t) dt + σ_p dB_t
Continuous-time version of a discrete-time, Gaussian AR(1) process. Suppose we observe the process at discrete intervals Δt; then this is equivalent to:
p_t = μ + φ_1 (p_{t−1} − μ) + σ ε_t,  ε_t ~ N(0, 1), with μ = p̄,
φ_1 = exp(−κ Δt),  σ² = σ_p² (1 − exp(−2κ Δt)) / (2κ)

Dynamic Multipliers
use the expression for the mean of the AR(1) to obtain:
r_{t+1} − μ = φ_1 (r_t − μ) + ε_{t+1}.
by repeated substitution, we get:
r_t − μ = Σ_{i=0}^{t−2} φ_1^i ε_{t−i} + φ_1^{t−1} (r_1 − μ).
the value of r_t at t is stated as a function of the history of shocks {ε_τ}_{τ=2}^{t} and its value at time t = 1
the effect of shocks dies out over time provided that −1 < φ_1 < 1.

Dynamic Multipliers
calculate the effect of a change in ε_1 on r_t:
∂r_t / ∂ε_1 = φ_1^{t−1},  ∂r_{t+j} / ∂ε_t = φ_1^j.
in a covariance stationary model, the dynamic multiplier depends only on j, not on t
again, note that we need |φ_1| < 1 for a stationary (non-explosive) system where shocks die out: lim_{j→∞} φ_1^j = 0.
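The half-life and unconditional-variance formulas are easy to verify numerically; a small sketch with illustrative values φ_1 = 0.9 and σ = 0.1 (chosen for the example, not taken from the slides):

```python
import numpy as np

# half-life: the horizon h that solves phi1**h = 0.5
phi1 = 0.9
h = np.log(0.5) / np.log(phi1)
assert np.isclose(phi1**h, 0.5)   # half the distance to the mean is covered

# unconditional variance of the AR(1): sigma^2 / (1 - phi1^2)
sigma = 0.1
var_r = sigma**2 / (1 - phi1**2)
assert var_r > sigma**2           # persistence inflates the variance
```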
MA(infinity) representation
use the expression for the mean of the AR(1) to obtain:
r_{t+1} − μ = φ_1 (r_t − μ) + ε_{t+1}.
by repeated substitution:
r_t − μ = Σ_{i=0}^{∞} φ_1^i ε_{t−i}.
a linear function of past innovations!
fits into the class of linear time series

Autocovariances of an AR(1)
take the unconditional expectation of:
(r_t − μ)(r_{t−j} − μ) = φ_1 (r_{t−1} − μ)(r_{t−j} − μ) + ε_t (r_{t−j} − μ).
this yields:
E[(r_t − μ)(r_{t−j} − μ)] = φ_1 E[(r_{t−1} − μ)(r_{t−j} − μ)] + E[ε_t (r_{t−j} − μ)].
or, using notation from Lecture 2:
γ_j = φ_1 γ_{j−1},  j > 0
γ_0 = φ_1 γ_1 + σ²,  j = 0
note that γ_j = γ_{−j}

Autocorrelation Function
this immediately implies that the ACF is: ρ_j = φ_1 ρ_{j−1},
and ρ_0 = 1
combining these two equations implies that:
ρ_j = φ_1^j
exponential decay at rate φ_1
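Iterating ρ_j = φ_1 ρ_{j−1} from ρ_0 = 1 reproduces φ_1^j exactly; a one-loop sketch with an illustrative φ_1 = 0.8:

```python
import numpy as np

# ACF of an AR(1) by recursion: rho_j = phi1 * rho_{j-1}, rho_0 = 1
phi1 = 0.8
rho = np.empty(11)
rho[0] = 1.0
for j in range(1, 11):
    rho[j] = phi1 * rho[j - 1]

# closed form: rho_j = phi1**j (exponential decay at rate phi1)
assert np.allclose(rho, phi1 ** np.arange(11))
```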

Autocorrelation Function of an AR(1)
[Figure] Autocorrelation Function for AR(1). The left panel considers φ_1 = 0.8; the right panel considers φ_1 = −0.8.

AR(p)
Definition
The AR(p) model is defined as:
r_t = φ_0 + φ_1 r_{t−1} + … + φ_p r_{t−p} + ε_t,  ε_t ~ WN(0, σ²)
other lagged returns might be useful in predicting returns
similar to multiple regression model with p lagged variables as explanatory variables
the AR(p) is Markov of order p.

Conditional Moments
conditional mean and conditional variance:
E[r_{t+1} | r_t, …, r_{t−p+1}] = φ_0 + φ_1 r_t + … + φ_p r_{t−p+1}
V[r_{t+1} | r_t, …, r_{t−p+1}] = V[ε_{t+1}] = σ²
moments conditional on r_t, …, r_{t−p+1} are not correlated with r_{t−i}, i ≥ p

Unconditional mean of an AR(2)
consider the model:
r_t = φ_0 + φ_1 r_{t−1} + φ_2 r_{t−2} + ε_t,  ε_t ~ WN(0, σ²)
take unconditional expectations to compute the mean:
E[r_t] = φ_0 + φ_1 E[r_{t−1}] + φ_2 E[r_{t−2}]
assuming stationarity and solving for the mean:
E[r_t] = μ = φ_0 / (1 − φ_1 − φ_2)
provided that φ_1 + φ_2 ≠ 1.
using this expression for μ, write the model in deviations from the mean:
r_t − μ = φ_1 (r_{t−1} − μ) + φ_2 (r_{t−2} − μ) + ε_t

Autocorrelations of an AR(2)
take the expectation of:
(r_t − μ)(r_{t−j} − μ) = φ_1 (r_{t−1} − μ)(r_{t−j} − μ) + φ_2 (r_{t−2} − μ)(r_{t−j} − μ) + ε_t (r_{t−j} − μ)
this yields:
E[(r_t − μ)(r_{t−j} − μ)] = φ_1 E[(r_{t−1} − μ)(r_{t−j} − μ)] + φ_2 E[(r_{t−2} − μ)(r_{t−j} − μ)] + E[ε_t (r_{t−j} − μ)]
or, using different notation:
γ_j = φ_1 γ_{j−1} + φ_2 γ_{j−2},  j > 0
γ_0 = φ_1 γ_1 + φ_2 γ_2 + σ²,  j = 0

Autocorrelations of an AR(2)
γ_j = φ_1 γ_{j−1} + φ_2 γ_{j−2},  j ≥ 2
ρ_0 = (φ_1 γ_1 + φ_2 γ_2 + σ²) / γ_0 = 1,  j = 0
which implies that the ACF of an AR(2) satisfies a second-order difference equation:
ρ_1 = φ_1 ρ_0 + φ_2 ρ_1  (since ρ_{−1} = ρ_1)
ρ_j = φ_1 ρ_{j−1} + φ_2 ρ_{j−2},  j ≥ 2

Roots
Definition
The second-order difference equation for the ACF:
(1 − φ_1 B − φ_2 B²) ρ_j = 0,
where B is the back-shift operator: B ρ_j = ρ_{j−1}. Note that we can write the above as:
(1 − λ_1 B)(1 − λ_2 B) ρ_j = 0
Intuitively, the AR(2) is an AR(1) on top of another AR(1).
From the AR(1) math, we had that each AR(1) is stationary if its autocorrelation is less than one in absolute value.
The "roots" λ_i should satisfy a similar property for the AR(2) to be stationary.
A useful factorization

Finding the roots
A simple case:
1 − φ_1 B − φ_2 B² = (1 − λ_1 B)(1 − λ_2 B)
= 1 − (λ_1 + λ_2) B + λ_1 λ_2 B²
and so we solve using the relations:
φ_1 = λ_1 + λ_2
φ_2 = −λ_1 λ_2
The solutions are the inverses of the solutions to the second-order polynomial in the scalar-valued x:
1 − φ_1 x − φ_2 x² = 0,
the solutions to this equation are given by:
x_1, x_2 = (φ_1 ± √(φ_1² + 4φ_2)) / (−2φ_2)
the inverses are the characteristic roots: λ_1 = x_1^{−1} and λ_2 = x_2^{−1}
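A numerical sketch of this recipe (the coefficients φ_1 = 0.6, φ_2 = 0.2 are illustrative), checking the factorization relations φ_1 = λ_1 + λ_2 and φ_2 = −λ_1 λ_2:

```python
import numpy as np

# solve 1 - phi1*x - phi2*x^2 = 0 by the quadratic formula, then invert
phi1, phi2 = 0.6, 0.2
disc = phi1**2 + 4 * phi2
x1 = (phi1 + np.sqrt(disc)) / (-2 * phi2)
x2 = (phi1 - np.sqrt(disc)) / (-2 * phi2)
lam1, lam2 = 1 / x1, 1 / x2      # characteristic roots

assert np.isclose(lam1 + lam2, phi1)     # phi1 = lam1 + lam2
assert np.isclose(-lam1 * lam2, phi2)    # phi2 = -lam1*lam2
assert max(abs(lam1), abs(lam2)) < 1     # both roots inside the unit circle
```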

Roots (real, distinct case)
two characteristic roots: λ_1 = x_1^{−1} and λ_2 = x_2^{−1}
both characteristic roots are real-valued if the discriminant is greater than zero: φ_1² + 4φ_2 > 0
then we can factor the polynomial as:
(1 − φ_1 B − φ_2 B²) = (1 − λ_1 B)(1 − λ_2 B)
two AR(1) models on top of each other
The ACF will decay like an AR(1) at long lags
Intuition: the effect of the smaller root dies out more quickly, and you are effectively left with just an AR(1)

Roots (complex-valued case)
two characteristic roots: λ_1 = x_1^{−1} and λ_2 = x_2^{−1}
both characteristic roots are complex-valued if the discriminant is negative: φ_1² + 4φ_2 < 0
then λ_1 and λ_2 are complex numbers
the ACF will look like damped sine and cosine waves

Autocorrelation for AR(2)
[Figure] Autocorrelation Function for AR(2) processes, for (φ_1, φ_2) = (1.2, −0.35), (0.2, 0.35), (−0.2, 0.35), and (0.6, −0.4).

AR(2) Example: The Dividend Price Ratio
The stock market Dividend to Price ratio is:
the sum of last year's dividends to firms in the market divided by the current market value
a "Valuation Ratio"
very slow-moving (persistent); quarterly post-WW2 data for the U.S.
[Figure] Stock market D/P ratio, by quarterly observation number.

Estimate an AR(2) on this variable. Stationarity test:
roots greater than 1 in modulus, so stationary despite φ_1 = 1.093 > 1, since φ_2 = −0.137:
1 − 1.09319 x + 0.13731 x² = 0
Unconditional mean:
μ = 0.00123254 / (1 − 1.09319 + 0.13731) = 0.0279
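The numbers in this example can be reproduced in a few lines; np.roots (coefficients ordered from the highest power down) solves the characteristic polynomial directly:

```python
import numpy as np

# solutions of 1 - 1.09319 x + 0.13731 x^2 = 0
x = np.roots([0.13731, -1.09319, 1.0])
assert np.all(np.abs(x) > 1)   # both solutions outside the unit circle -> stationary

# unconditional mean from the slide's coefficients
mu = 0.00123254 / (1 - 1.09319 + 0.13731)
assert abs(mu - 0.0279) < 5e-4
```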

AR(2) DP prediction
% iterate AR(2) forecasts of the D/P ratio h steps ahead, feeding
% earlier forecasts back in as lagged values:
Pred_DP1 = uncond_mean + phi1*(DP(2:end)-uncond_mean) + phi2*(DP(1:end-1)-uncond_mean);
Pred_DP2 = uncond_mean + phi1*(Pred_DP1-uncond_mean) + phi2*(DP(2:end)-uncond_mean);
Pred_DP3 = uncond_mean + phi1*(Pred_DP2-uncond_mean) + phi2*(Pred_DP1-uncond_mean);
etc.
[Figure] DP predicted values: 1 quarter ahead, 8 quarters ahead, and infinite quarters ahead, by quarterly observation number.

Stationarity
Recall: the modulus of z = a + bi is |z| = √(a² + b²). Thus, for real numbers the modulus is simply the absolute value.
An AR(1) process is stationary if its characteristic root is less than one in modulus, i.e. if 1/x = λ_1 is less than one in modulus. This condition implies that ρ_j = φ_1^j converges to zero as j → ∞.
An AR(2) process is stationary if the two characteristic roots λ_1 and λ_2 (the inverses of the solutions x_1 and x_2) are less than one in modulus.

Stationarity of AR(p)
An AR(p) process is stationary if all p characteristic roots of the polynomial below are less than one in modulus:
1 − φ_1 x − φ_2 x² − … − φ_p x^p = 0
see Chapter 2 in Hamilton (1994) for details.

Partial Autocorrelation Function
Definition
The PACF of a stationary series is defined as {φ_{j,j}}, j = 1, …, n, from the regressions:
r_t = φ_{0,1} + φ_{1,1} r_{t−1} + v_{1t}
r_t = φ_{0,2} + φ_{1,2} r_{t−1} + φ_{2,2} r_{t−2} + v_{2t}
r_t = φ_{0,3} + φ_{1,3} r_{t−1} + φ_{2,3} r_{t−2} + φ_{3,3} r_{t−3} + v_{3t}
These are simple multiple regressions that can be estimated with least squares.
φ_{p,p} shows the incremental contribution of r_{t−p} to r_t over an AR(p−1) model.
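A sketch of the regression definition on a simulated AR(1) with an illustrative φ_1 = 0.7: the last-lag coefficient φ̂_{j,j} is read off an OLS fit, so φ̂_{1,1} should be near φ_1 and φ̂_{2,2} near zero:

```python
import numpy as np

rng = np.random.default_rng(1)
phi1, T = 0.7, 5_000
r = np.zeros(T)
for t in range(1, T):
    r[t] = phi1 * r[t - 1] + rng.normal()

def pacf(r, p):
    """phi_{p,p}: coefficient on r_{t-p} when regressing r_t on 1, r_{t-1}, ..., r_{t-p}."""
    y = r[p:]
    X = np.column_stack([np.ones(len(y))] + [r[p - j:-j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[-1]

assert abs(pacf(r, 1) - phi1) < 0.05   # near phi1
assert abs(pacf(r, 2)) < 0.05          # cuts off after lag 1
```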

Definition
The sample partial autocorrelations (PACF) of a time series are defined as φ̂_{1,1}, φ̂_{2,2}, …, φ̂_{p,p}, …

Partial Autocorrelation Function
The PACF of an AR(p) satisfies:
1 φ̂_{p,p} → φ_p as the sample size increases
2 φ̂_{j,j} → 0 for j > p
for an AR(p) series, the sample PACF cuts off after lag p
⇒ look at the sample PACF to determine an appropriate value of p

PACF of Daily Log Returns
[Figure] Sample partial autocorrelation coefficients: PACF for daily log returns on the VW-CRSP index, 1926-2007. Two standard error bands around zero.

Information Criteria
information criteria help determine the optimal lag length. The Akaike (1973) information criterion:
AIC = −2 ln(likelihood) + 2 (number of parameters)
the Bayesian information criterion of Schwarz (1978):
BIC = −2 ln(likelihood) + ln(T) (number of parameters)
the BIC penalty depends on the sample size T
for different values of p, compute AIC(p) and/or BIC(p); pick the lag length with the minimum AIC/BIC
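A minimal sketch of the selection procedure, assuming conditional least squares and a Gaussian likelihood (the simulated AR(2) and the helper function bic are illustrative, not from the lecture; all orders are fit on the same T − pmax observations so the criteria are comparable):

```python
import numpy as np

rng = np.random.default_rng(2)
T, phi1, phi2 = 2_000, 0.5, 0.3
r = np.zeros(T)
for t in range(2, T):
    r[t] = phi1 * r[t - 1] + phi2 * r[t - 2] + rng.normal()

def bic(r, p, pmax):
    """BIC of an AR(p) fit by conditional least squares."""
    y = r[pmax:]
    n = len(y)
    X = np.column_stack([np.ones(n)] + [r[pmax - j:-j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + np.log(n) * (p + 2)   # p + 2 params: intercept, p lags, sigma^2

pmax = 6
best_p = min(range(1, pmax + 1), key=lambda p: bic(r, p, pmax))
# best_p should typically recover the true order, p = 2
```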

Manufacturing White Noise
to check the performance of the AR model you've selected: check the residuals!
residuals should look like white noise
look at the ACF of the residuals
perform the Ljung-Box test on the residuals
Q(m) ~ χ²(m − p), where p is the lag length of the AR(p) model
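The Q statistic itself is a short computation; a pure-NumPy sketch, with simulated white noise standing in for actual AR(p) residuals:

```python
import numpy as np

# Ljung-Box: Q(m) = T (T+2) * sum_{k=1}^m rhohat_k^2 / (T - k)
rng = np.random.default_rng(3)
e = rng.normal(size=1_000)   # stand-in for AR(p) residuals
T, m = len(e), 10

ebar = e - e.mean()
gamma0 = ebar @ ebar / T
rho = np.array([(ebar[k:] @ ebar[:-k]) / (T * gamma0) for k in range(1, m + 1)])
Q = T * (T + 2) * np.sum(rho**2 / (T - np.arange(1, m + 1)))
# under the white-noise null, Q(m) is approximately chi-squared with m - p d.o.f.
```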

Forecasting
suppose we have an AR(p) model
we want to forecast r_{t+h} using all the information F_t available at t
assume we choose the forecast to minimize the mean squared error:
E[(y − ŷ)²]
The conditional mean minimizes the mean squared forecast error. We will come back to optimal forecasting later.

1-step ahead forecast error
the AR(p) model is given by:
r_{t+1} = φ_0 + φ_1 r_t + … + φ_p r_{t−p+1} + ε_{t+1}
take the conditional expectation:
E_t[r_{t+1}] = φ_0 + φ_1 r_t + … + φ_p r_{t−p+1}
the one-step ahead forecast error:
v_t(1) = r_{t+1} − φ_0 − Σ_{i=1}^{p} φ_i r_{t−i+1} = ε_{t+1}
the variance of the one-step ahead forecast error:
V[v_t(1)] = σ²
if ε_t is normally distributed, then the 95% confidence interval is ±1.96 σ

2-step ahead forecast error
the AR(p) model is given by:
r_{t+2} = φ_0 + φ_1 r_{t+1} + … + φ_p r_{t−p+2} + ε_{t+2}
we just take the conditional expectation:
E_t[r_{t+2}] = φ_0 + φ_1 r̂_t(1) + … + φ_p r_{t−p+2}
the two-step ahead forecast error:
v_t(2) = φ_1 v_t(1) + ε_{t+2} = φ_1 ε_{t+1} + ε_{t+2}
the variance of the two-step ahead forecast error:
V[v_t(2)] = σ² (1 + φ_1²)
the variance of the two-step ahead forecast error is larger than the variance of the one-step ahead forecast error
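Extending this pattern, the h-step ahead AR(1) forecast error is v_t(h) = Σ_{i=0}^{h−1} φ_1^i ε_{t+h−i}, with variance σ² Σ_{i=0}^{h−1} φ_1^{2i}. A sketch with illustrative values (φ_0 = 0.02, φ_1 = 0.8, σ = 0.1, current value r_t = 0.3, all made up):

```python
import numpy as np

phi0, phi1, sigma = 0.02, 0.8, 0.1
mu = phi0 / (1 - phi1)
r_t = 0.3

def forecast(h):
    return mu + phi1**h * (r_t - mu)                    # E_t[r_{t+h}]

def err_var(h):
    return sigma**2 * sum(phi1**(2 * i) for i in range(h))

assert np.isclose(err_var(1), sigma**2)                  # one-step: sigma^2
assert np.isclose(err_var(2), sigma**2 * (1 + phi1**2))  # two-step, as above
# as h grows, the forecast tends to mu and the error variance tends to the
# unconditional variance sigma^2 / (1 - phi1^2)
assert abs(forecast(200) - mu) < 1e-12
assert np.isclose(err_var(200), sigma**2 / (1 - phi1**2))
```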

Multi-step ahead forecast error
The h-step ahead forecast is given by:
r̂_t(h) = φ_0 + Σ_{i=1}^{p} φ_i r̂_t(h − i),
where r̂_t(j) = r_{t+j} if j ≤ 0.
the h-step ahead forecast converges to the unconditional expectation E[r_t]
this is referred to as mean reversion

Estimation: conditional least squares
assume we observe or can condition on the first p observations. The AR(p) model is then a linear regression model:
r_t = φ_0 + φ_1 r_{t−1} + … + φ_p r_{t−p} + ε_t,  t = p+1, …, T
using least squares, the fitted model is:
r̂_t = φ̂_0 + φ̂_1 r_{t−1} + … + φ̂_p r_{t−p}
and the residual is v_t = r_t − r̂_t
the estimated variance of the residuals is:
σ̂² = (Σ_{t=p+1}^{T} v_t²) / (T − 2p − 1)

ML Estimation
alternatively, we could use maximum likelihood. The log-likelihood function is:
ln p(r_1, r_2, …, r_T; θ) = Σ_{t=2}^{T} ln p(r_t | r_{t−1}, …, r_1; θ) + ln p(r_1; θ)
for example, assume Gaussian shocks ε_t; then p(r_t | r_{t−1}, …, r_{t−p}; θ) is normal
the difference between least squares and ML estimation of (φ_0, φ_1, …, φ_p) are the initial distributions p(r_1; θ), p(r_2 | r_1; θ), …
Conditional least squares of an AR(p) drops the first p terms in the likelihood.

Example: ML Estimation of AR(1)
assume the initial value r_1 comes from the stationary distribution. Unconditional moments:
E[r_1] = φ_0 / (1 − φ_1),  V[r_1] = σ² / (1 − φ_1²)
hence, the density p(r_1; θ) of the first observation r_1 is normal with the above (unconditional) mean and variance
for t > 1, the conditional moments:
E[r_t | r_{t−1}] = φ_0 + φ_1 r_{t−1},  V[r_t | r_{t−1}] = σ²
hence, the conditional density p(r_t | r_{t−1}; θ) is normal with these conditional moments.
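A conditional-least-squares sketch for the AR(1) case, dropping the first observation as described above (the simulated data and true parameter values are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
phi0, phi1, sigma, T = 0.02, 0.6, 0.1, 20_000
r = np.zeros(T)
r[0] = phi0 / (1 - phi1)
for t in range(1, T):
    r[t] = phi0 + phi1 * r[t - 1] + sigma * rng.normal()

# regress r_t on (1, r_{t-1}) for t = 2, ..., T
y = r[1:]
X = np.column_stack([np.ones(T - 1), r[:-1]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2_hat = resid @ resid / (len(y) - 2)   # T - 2p - 1 with p = 1

assert abs(beta[1] - phi1) < 0.03   # phi1_hat close to the true value
```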
