Generate \(n=100\) observations from the autoregression \[ x_t = -0.9 x_{t-1} + \omega_t, \] where \(\omega_t\) is Gaussian white noise with standard deviation \(\sigma_\omega=1\) and \(x_0=0\). Next, apply the moving average filter \[ v_t = (x_t+x_{t-1}+x_{t-2}+x_{t-3})/4 \] to the data \(x_t\) you generated.
set.seed(1234)
n = 100
x = rep(0,n)
omega = rnorm(n,0,1)               # Gaussian white noise, sd = 1
x[1] = omega[1]                    # since x[0] = 0
for (t in 2:n)
  x[t] = -0.9*x[t-1]+omega[t]      # AR(1) recursion
v = rep(0,n)
for (t in 4:n)
  v[t] = sum(x[t:(t-3)])/4         # moving average of x[t], x[t-1], x[t-2], x[t-3]
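As a cross-check, the same kind of simulation and smoothing can be done with the built-in functions arima.sim() and stats::filter(); this is only a sketch (the random draws and initialization differ from the loop above), not part of the required solution.

set.seed(1234)
x2 = arima.sim(model = list(ar = -0.9), n = 100, sd = 1)   # AR(1) with phi = -0.9, sigma = 1
v2 = stats::filter(x2, rep(1/4, 4), sides = 1)             # one-sided 4-point moving average; first 3 values are NA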
Now plot \(x_t\) as a line and superimpose \(v_t\) as a dashed line. Comment on the behavior of \(x_t\) and how applying the moving average filter changes that behavior.
plot(4:n,x[4:n],xlab="Time",ylab="",type="l")
lines(4:n,v[4:n],lty=2,col=2,lwd=2)
legend("topright",legend=c("x(t)","v(t)"),col=1:2,lty=c(1,2),lwd=2,bty="n")
Repeat the previous part, but with \(x_t = \cos(2 \pi t/4)\).
x = cos(2*pi*(1:n)/4)
v = rep(0,n)
for (t in 4:n)
v[t] = sum(x[t:(t-3)])/4
plot(4:n,x[4:n],xlab="Time",ylab="",type="l")
lines(4:n,v[4:n],lty=2,col=2,lwd=2)
legend("topright",legend=c("x(t)","v(t)"),col=1:2,lty=c(1,2),lwd=2,bty="n")
Repeat again, but with added \(N(0,1)\) noise: \(x_t = \cos(2 \pi t/4) + \omega_t\).
x = cos(2*pi*(1:n)/4)+rnorm(n,0,1)
v = rep(0,n)
for (t in 4:n)
v[t] = sum(x[t:(t-3)])/4
plot(4:n,x[4:n],xlab="Time",ylab="",type="l")
lines(4:n,v[4:n],lty=2,col=2,lwd=2)
legend("topright",legend=c("x(t)","v(t)"),col=1:2,lty=c(1,2),lwd=2,bty="n")
Well, well, well. Both (b) and (c), and therefore (d), are ruined because I forgot to divide the cosine argument by 4 in the homework assignment; the solutions above use the intended \(\cos(2\pi t/4)\). I will give full credit for (b), (c), and (d) to everyone. The main message was that moving averages might be able to get rid of (or at least attenuate) cyclical components.
Consider the time series \[ x_t = \beta_1 + \beta_2 t + \omega_t, \] where \(\beta_1\) and \(\beta_2\) are known constants and \(\omega_t\) is a white noise process with variance \(\sigma_{\omega}^2\).
It is not stationary, since \(E(x_t) = \beta_1+\beta_2 t\) is a function of \(t\)!
\[ x_t - x_{t-1} = \beta_2 + (\omega_t-\omega_{t-1}), \] which is stationary since \(v_t=\omega_t-\omega_{t-1}\) is stationary: \[\begin{eqnarray*} E(v_t) &=& 0\\ V(v_t) &=& 2\sigma_\omega^2\\ Cov(v_t,v_{t-1})&=&Cov(\omega_t-\omega_{t-1},\omega_{t-1}-\omega_{t-2})=-\sigma_\omega^2\\ Cov(v_t,v_{t-k})&=&Cov(\omega_t-\omega_{t-1},\omega_{t-k}-\omega_{t-k-1})=0 \qquad k>1. \end{eqnarray*}\]
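A quick numerical check of these moments (a minimal sketch with the arbitrary illustrative values \(\beta_1=1\), \(\beta_2=0.5\), \(\sigma_\omega=1\), which are not part of the problem):

set.seed(42)
n = 10000
x = 1 + 0.5*(1:n) + rnorm(n)        # x_t = beta1 + beta2*t + omega_t
d = diff(x)                         # d_t = beta2 + (omega_t - omega_{t-1})
mean(d)                             # close to beta2 = 0.5
var(d)                              # close to 2*sigma_omega^2 = 2
acf(d, lag.max = 3, plot = FALSE)   # lag-1 autocorrelation near -0.5, higher lags near 0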
Show that the mean of the moving average \[ v_t = \frac{1}{2q+1} \sum_{j=-q}^q x_{t-j} \] is \(\beta_1+\beta_2 t\), and give a simplified expression for the autocovariance function.
It is easy to see that \[\begin{eqnarray*} E(v_t) &=& \frac{1}{2q+1} \sum_{j=-q}^q (\beta_1+\beta_2 t - \beta_2 j)\\ &=& \frac{1}{2q+1}\left( (2q+1)(\beta_1+\beta_2 t)-\beta_2\sum_{j=-q}^q j \right)\\ &=& \beta_1+\beta_2 t, \end{eqnarray*}\] since \(\sum_{j=-q}^q j = 0\).
Also, \(Cov(v_t,v_{t-h})\) for \(h \geq 0\) (the same applies for \(h \leq 0\) by symmetry) boils down to \[ \frac{1}{(2q+1)^2}Cov\left(\sum_{j=-q}^q \omega_{t-h-j},\sum_{j=-q}^q \omega_{t-j}\right) = \frac{2q+1-h}{(2q+1)^2}\sigma^2_\omega \] for \(h \leq 2q\), since the two sums share \(2q+1-h\) common \(\omega\) terms, and \(Cov(v_t,v_{t-h})=0\) for \(h>2q\).
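This can also be checked by simulation; a minimal sketch with the illustrative values \(q=2\) and \(\sigma_\omega^2=1\) (not part of the problem). Since the trend is deterministic, the autocovariance of \(v_t\) is that of the smoothed noise alone:

set.seed(42)
n = 50000; q = 2; sw2 = 1
w = rnorm(n, 0, sqrt(sw2))
vw = stats::filter(w, rep(1/(2*q+1), 2*q+1), sides = 2)   # symmetric (2q+1)-point average of the noise
vw = vw[(q+1):(n-q)]                                      # drop the NA endpoints
emp = acf(vw, lag.max = 2*q+1, type = "covariance", plot = FALSE)
round(drop(emp$acf), 4)                                   # empirical autocovariances at lags 0,...,2q+1
round(c(((2*q+1) - 0:(2*q))/(2*q+1)^2 * sw2, 0), 4)       # theoretical: (2q+1-h)/(2q+1)^2*sigma^2, then 0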
Consider the random walk with drift model \[ x_t = \delta + x_{t-1} + \omega_t, \] for \(t=1,2,\ldots\), with \(x_0=0\), where \(\omega_t\) is white noise with variance \(\sigma_\omega^2\).
\[\begin{eqnarray*} x_t &=& \delta + x_{t-1} + \omega_t\\ &=& \delta + (\delta + x_{t-2} + \omega_{t-1}) + \omega_t\\ &=& 2\delta + x_{t-2} + \omega_t+\omega_{t-1}\\ &\vdots& \\ &=& \delta t + \sum_{k=1}^t \omega_k. \end{eqnarray*}\]
\[ Cov(x_t,x_{t-h}) = Cov\left(\delta t + \sum_{k=1}^t \omega_k,\delta (t-h) + \sum_{k=1}^{t-h} \omega_k\right) = (t-h)\sigma^2_\omega, \] for \(h \geq 0\), since the two sums share \(t-h\) common \(\omega\) terms.
\(Cov(x_t,x_{t-h})\) depends on \(t\) (not just on the lag \(h\)), so \(x_t\) is not stationary.
The transformation \[ z_t = x_t-x_{t-1} = \delta + \omega_t \] is stationary, since it is just white noise shifted by \(\delta\).
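This can be illustrated quickly in R; a sketch with the illustrative values \(\delta=0.1\) and \(\sigma_\omega=1\), which are not given in the problem:

set.seed(1234)
n = 500; delta = 0.1
omega = rnorm(n)
x = cumsum(delta + omega)     # random walk with drift, x_0 = 0
z = diff(c(0, x))             # z_t = x_t - x_{t-1} = delta + omega_t
mean(z); var(z)               # approximately delta = 0.1 and sigma_omega^2 = 1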
Suppose that the simple return of a monthly bond index follows the \(MA(1)\) model \[ x_t = \omega_t + 0.2 \omega_{t-1}, \] where \(\omega_t\) is a white noise process with standard deviation \(\sigma_\omega = 0.025\).
Assume that \(\omega_{100}=0.01\). Compute the 1-step- and 2-step-ahead forecasts of the return at the forecast origin \(t=100\). What are the standard deviations of the associated forecast errors? Also compute the lag-1 and lag-2 autocorrelations of the return series.
It is easy to see that \[\begin{eqnarray*} E(x_{101}|\omega_{100}) &=& E(\omega_{101} + 0.2 \omega_{100}|\omega_{100}) = 0.2\omega_{100}=0.002\\ V(x_{101}|\omega_{100}) &=& V(\omega_{101} + 0.2 \omega_{100}|\omega_{100}) = V(\omega_{101}) = \sigma_\omega^2=0.025^2\\ E(x_{102}|\omega_{100}) &=& E(\omega_{102} + 0.2 \omega_{101}|\omega_{100}) = 0\\ V(x_{102}|\omega_{100}) &=& V(\omega_{102} + 0.2 \omega_{101}) = (1+0.2^2)\sigma_\omega^2 \approx 0.0255^2, \end{eqnarray*}\] so the forecast error standard deviations are \(0.025\) and approximately \(0.0255\).
Obviously, the lag-2 autocorrelation is zero since the lag-2 autocovariance is zero. In fact, all lag-\(k\) autocorrelations are zero for \(k \geq 2\). Also, the lag-1 autocorrelation is given by \[ Corr(x_t,x_{t-1}) = \frac{Cov(\omega_t+0.2\omega_{t-1},\omega_{t-1}+0.2\omega_{t-2})}{V(\omega_t+0.2\omega_{t-1})} = \frac{0.2\,Cov(\omega_{t-1},\omega_{t-1})}{(1+0.2^2)\sigma^2_\omega}= \frac{0.2\sigma^2_\omega}{(1+0.2^2)\sigma^2_\omega} = \frac{0.2}{1.04}=0.1923. \]
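These numbers can be reproduced in R; a minimal sketch, with ARMAacf() (a standard stats function) used only as a check of the autocorrelations:

sw = 0.025; theta = 0.2; w100 = 0.01
c(forecast1 = theta*w100, sd1 = sw)             # 1-step ahead: 0.002, error sd 0.025
c(forecast2 = 0, sd2 = sqrt(1 + theta^2)*sw)    # 2-step ahead: 0, error sd ~0.0255
ARMAacf(ma = theta, lag.max = 2)                # autocorrelations at lags 0, 1, 2: 1, 0.1923, 0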
Suppose that the daily log return of a security follows the model \[
x_t = 0.01 + 0.2 x_{t-2} + \omega_t
\] where \(\omega_t\) is a Gaussian white noise series with mean zero and variance \(\sigma_\omega^2=0.02\). What are the mean and variance of the return series \(x_t\)?
Compute the lag-1 and lag-2 autocorrelations of \(x_t\). Assume that \(x_{100}=-0.01\) and \(x_{99} = 0.02\). Compute the 1- and 2-step-ahead forecasts of the return series at the forecast origin \(t=100\). What are the associated standard deviations of the forecast errors?
For a stationary AR(2) process \(x_t = \alpha + 0\cdot x_{t-1} + \beta x_{t-2} + \omega_t\), where \(\omega_t \sim N(0,\sigma_\omega^2)\), the unconditional mean and unconditional variance can be obtained by noticing that \[\begin{eqnarray*} E(x_t) &=& \alpha + \beta E(x_{t-2})\\ V(x_t) &=& \beta^2 V(x_{t-2}) + \sigma_\omega^2, \end{eqnarray*}\] and that, by stationarity, \(E(x_t)=E(x_{t-2})\) and \(V(x_t)=V(x_{t-2})\), so \[ E(x_t) = \frac{\alpha}{1-\beta} \ \ \ \mbox{and} \ \ \ V(x_t) = \frac{\sigma^2_\omega}{1-\beta^2}. \] Therefore, for \(\alpha=0.01\), \(\beta=0.2\) and \(\sigma_\omega^2=0.02\), \(E(x_t)=0.01/(1-0.2)=0.0125\) and \(V(x_t)=0.02/(1-0.2^2)=0.02083333\).
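A minimal simulation check of these two values (the seed and series length are arbitrary choices for illustration):

set.seed(1234)
alpha = 0.01; beta = 0.2; sw2 = 0.02
c(alpha/(1 - beta), sw2/(1 - beta^2))     # theoretical mean and variance: 0.0125, 0.02083333
n = 50000
x = rep(alpha/(1 - beta), n)              # start at the unconditional mean
for (t in 3:n)
  x[t] = alpha + beta*x[t-2] + rnorm(1, 0, sqrt(sw2))
c(mean(x), var(x))                        # should be close to the theoretical values above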
From the course notes, the autocorrelation function of this AR(2) process (lag-1 coefficient equal to zero) is given by \(\rho_1=0\), \(\rho_2=\beta\) and \(\rho_l = \beta \rho_{l-2}\), for \(l=3,4,\ldots\), so here \(\rho_1=0\) and \(\rho_2=0.2\). The simulated example below (with \(\beta=0.9\) for illustration) shows this pattern in the sample ACF.
n = 1000
x = rep(0,n)
for (t in 3:n)
  x[t] = 0.01+0.9*x[t-2]+rnorm(1,0,sqrt(0.02))   # same model, but with beta = 0.9 so the pattern is visible
acf(x,main="beta=0.9 for illustration")          # spikes at even lags, roughly zero at odd lags
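For comparison, the theoretical ACF of this illustrative process (lag-1 coefficient 0, lag-2 coefficient 0.9) can be obtained with the stats function ARMAacf():

ARMAacf(ar = c(0, 0.9), lag.max = 10)   # zero at odd lags, 0.9^(l/2) at even lags l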
Finally, \[\begin{eqnarray*} x_{101} &=& \alpha+\beta x_{99} + \omega_{101}\\ x_{102} &=& \alpha+\beta x_{100} + \omega_{102}, \end{eqnarray*}\] so \[\begin{eqnarray*} E(x_{101}|x_{100},x_{99}) &=& \alpha+\beta x_{99}=0.014\\ E(x_{102}|x_{100},x_{99}) &=& \alpha+\beta x_{100}=0.008, \end{eqnarray*}\] while \(V(x_{101}|x_{100},x_{99})=V(x_{102}|x_{100},x_{99})=\sigma_\omega^2=0.02\), so both forecast error standard deviations equal \(\sqrt{0.02}\approx 0.1414\).
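In numbers, a minimal sketch of the same calculations:

alpha = 0.01; beta = 0.2; sw2 = 0.02
x100 = -0.01; x99 = 0.02
alpha + beta*x99         # 1-step-ahead forecast: 0.014
alpha + beta*x100        # 2-step-ahead forecast: 0.008
sqrt(sw2)                # forecast error sd at both horizons: ~0.1414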