D.1 ARMA and Return Decomposition

Explain how an ARMA(p,q) model fits into the general return notation $r_t = \mu_{t - 1} + \epsilon_t$.

$\mu_{t - 1}$ is the conditional mean: it depends linearly on the past $p$ realized returns and on the past $q$ forecast errors, and it is known at time $t-1$.

$\epsilon_t$ is an unforecastable i.i.d. random variable (the shock, or forecast error, at day $t$).
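
As a minimal sketch of this decomposition, the AR(1) special case $\mu_{t-1} = \phi\, r_{t-1}$ can be simulated step by step (the coefficient and shock volatility here are hypothetical):

```python
import random

random.seed(0)

phi = 0.3          # hypothetical AR(1) coefficient
sigma = 0.01       # hypothetical std. dev. of the shock epsilon_t
r_prev = 0.0       # r_{t-1}, starting value

returns = []
for _ in range(5):
    mu = phi * r_prev               # mu_{t-1}: forecastable part, known at t-1
    eps = random.gauss(0.0, sigma)  # epsilon_t: unforecastable shock
    r = mu + eps                    # r_t = mu_{t-1} + epsilon_t
    returns.append(r)
    r_prev = r
```

At every step the return splits exactly into the part knowable yesterday and the new shock.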


D.2 White Noise

Characterize a White Noise process

  $\epsilon_t$ follows a White Noise process, written $\epsilon_t \sim i.i.d.\space (0, \sigma^2_\epsilon)$,

exactly when it has the following characteristics:

  1. Its mean equals 0: $E(\epsilon_t) = 0$ for all $t$.
  2. There is no correlation between samples: $corr(\epsilon_s, \epsilon_t) = 0$ for all $s \ne t$ (the correlation between any two lags is 0). That means the past carries no information about the future.
  3. Its variance is constant: $var(\epsilon_t) = \sigma^2_\epsilon$ for all $t$. A process with constant variance is also called homoscedastic.
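
These three properties can be checked empirically on a simulated series: the sample mean, sample variance, and lag-1 autocorrelation should all be close to their theoretical values. A sketch using only the standard library, with $\sigma^2_\epsilon = 1$:

```python
import random

random.seed(42)
n = 100_000
eps = [random.gauss(0.0, 1.0) for _ in range(n)]  # Gaussian white noise

mean = sum(eps) / n                         # property 1: should be near 0
var = sum(e * e for e in eps) / n           # property 3: should be near 1

# lag-1 sample autocorrelation (property 2): should be near 0
acf1 = sum(eps[t] * eps[t - 1] for t in range(1, n)) / (n * var)
```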

D.3 Gaussian White Noise

Characterize a Gaussian White Noise Process.

$\epsilon_t$ follows a Gaussian White Noise process if $\epsilon_t \sim i.i.d.\space N(0, \sigma^2_\epsilon)$, i.e. the shocks are normally distributed.

A Gaussian White Noise is a White Noise (from D.2) with an additional fourth characteristic:

  4. Each $\epsilon_t$ is normally distributed: $\epsilon_t \sim N(0, \sigma^2_\epsilon)$.


D.4 Gaussian White Noise

Write down the parameterized probability density function of a Gaussian White Noise Process.

$$ Prob(\epsilon_t) = \dfrac{1}{\sqrt{2\pi\sigma^2_\epsilon}}\space e^{-\dfrac{\epsilon^2_t}{2\sigma^2_\epsilon}} $$
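
This density can be evaluated directly; for example, at $\epsilon_t = 0$ it equals $1/\sqrt{2\pi\sigma^2_\epsilon}$. A sketch (the function name is my own):

```python
import math

def gwn_pdf(eps_t, sigma2):
    """Gaussian white-noise density with mean 0 and variance sigma2."""
    return math.exp(-eps_t**2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)
```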


D.5 Data Generating Processes of some ARMA

Write down the DGP of an AR(1), MA(1), MA(∞) and ARMA(1,1) process.

Notation: $\phi, \space \theta$ are some constant coefficients in $\mathbb{R}$

          $x_t$ is the value of the time series at the day $t$

     $\epsilon_t$ is a random error variable at day $t$ ( $\epsilon_t \sim i.i.d.\space(0, \sigma^2_\epsilon)$ )

AR(1): $x_t = \phi x_{t - 1} \space + \epsilon_t$

MA(1): $x_t = \epsilon_t + \theta \epsilon_{t - 1}$

MA($\infty$):

$$ x_t = \sum_{j = 0}^{\infty} \phi^j \space \epsilon_{t - j} $$

ARMA(1,1): $x_t = \phi x_{t - 1} + \theta \epsilon_{t - 1} \space + \epsilon_t$
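
The AR(1), MA(1), and ARMA(1,1) DGPs can be simulated side by side (the MA($\infty$) sum would have to be truncated in practice, so it is omitted here). A minimal sketch with hypothetical coefficients $\phi = 0.5$, $\theta = 0.4$:

```python
import random

random.seed(1)
phi, theta, n = 0.5, 0.4, 1000   # hypothetical coefficients and sample size
eps = [random.gauss(0.0, 1.0) for _ in range(n + 1)]  # eps[t - 1] exists for t = 1

ar1, ma1, arma11 = [0.0], [0.0], [0.0]
for t in range(1, n + 1):
    ar1.append(phi * ar1[-1] + eps[t])                        # AR(1)
    ma1.append(eps[t] + theta * eps[t - 1])                   # MA(1)
    arma11.append(phi * arma11[-1] + theta * eps[t - 1] + eps[t])  # ARMA(1,1)
```

With $|\phi| < 1$ all three simulated series stay bounded, consistent with stationarity.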


D.6 Unconditional vs Conditional Forecast

Explain the difference between an unconditional and a conditional forecast.

The unconditional forecast of a time series $x_t$ is its expected value without using any past information. It is denoted $E[x_t]$.

The conditional forecast, in contrast, is the expected value given our information set $F_{t-k}$, denoted $E[x_t|F_{t-k}]$. This is the k-step-ahead forecast.

For example, with $k=1$ we have the 1-step-ahead forecast $E[x_t|F_{t-1}]$.

The unconditional forecast coincides with the conditional k-step-ahead forecast for large values of $k$ (today's information does not help us predict stocks 500 years from now at all):

$\lim_{k\rightarrow\infty} E[x_{t+k}| F_t] = E[x_{t+k}]$
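
For a zero-mean AR(1), the k-step-ahead conditional forecast is $E[x_{t+k}|F_t] = \phi^k x_t$, which decays toward the unconditional forecast $E[x_{t+k}] = 0$ as $k$ grows. A quick numerical illustration (values hypothetical):

```python
phi = 0.8   # hypothetical AR(1) coefficient, |phi| < 1
x_t = 2.0   # hypothetical value observed today

# k-step-ahead conditional forecast of a zero-mean AR(1): phi**k * x_t
forecasts = {k: phi**k * x_t for k in (1, 5, 20, 100)}
# As k grows, phi**k * x_t shrinks toward 0 = E[x], the unconditional forecast
```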


D.7 Weak Stationarity