GARCH(1,1) Volatility Model

A comprehensive exploration of the Generalized Autoregressive Conditional Heteroskedasticity model for time series volatility forecasting.

Mathematical Foundation

\[r_t = \mu + \epsilon_t\]
\[\epsilon_t = \sigma_t z_t\]
\[z_t \sim N(0,1) \text{ (i.i.d.)}\]
\[\sigma_{t+1}^2 = \omega + \alpha \epsilon_t^2 + \beta \sigma_t^2\]

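To make the recursion concrete, here is a minimal sketch of a single step of the process. The parameter values are purely illustrative, not estimated from data:

```python
import numpy as np

# Illustrative parameter values (assumed, not estimated from data)
mu, omega, alpha, beta = 0.0, 0.05, 0.1, 0.85

sigma2_t = 1.2                       # today's conditional variance
rng = np.random.default_rng(0)
z_t = rng.standard_normal()          # z_t ~ N(0, 1), i.i.d.
eps_t = np.sqrt(sigma2_t) * z_t      # epsilon_t = sigma_t * z_t
r_t = mu + eps_t                     # today's return

# Tomorrow's variance mixes the constant term, today's squared shock,
# and today's variance
sigma2_next = omega + alpha * eps_t**2 + beta * sigma2_t
```

Iterating this one-step update is exactly what the simulation code later in this post does.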
Understanding the Components

The GARCH(1,1) model forecasts tomorrow's volatility as a weighted average of:

  • ω (omega): the long-run average variance component
  • α (alpha): the impact of today's shock (the most recent squared return innovation)
  • β (beta): the persistence of past volatility

Stationarity Condition: For the process to be covariance-stationary, we require: \[1 - \alpha - \beta > 0\] This ensures the unconditional variance is finite.
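The condition is trivial to check for any candidate parameter pair; a quick sketch:

```python
def is_covariance_stationary(alpha: float, beta: float) -> bool:
    """GARCH(1,1) is covariance-stationary iff alpha + beta < 1
    (equivalently, 1 - alpha - beta > 0)."""
    return alpha + beta < 1.0

print(is_covariance_stationary(0.10, 0.85))  # typical equity-like values
print(is_covariance_stationary(0.20, 0.85))  # alpha + beta > 1: non-stationary
```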

Maximum Likelihood Estimation

Assuming the innovation \(\epsilon_t\) is conditionally normal with mean 0 and variance \(\sigma_t^2\) (the conditional variance forecast made one step earlier), we can derive the log-likelihood function:

\[l(\theta) = \sum_{t=1}^{T} \log\left[\frac{1}{\sqrt{2\pi\sigma_t^2}} \exp\left(-\frac{\epsilon_t^2}{2\sigma_t^2}\right)\right]\]
\[l(\theta) = -\frac{T}{2}\ln(2\pi) - \frac{1}{2}\sum_{t=1}^{T}\ln(\sigma^2_t) - \frac{1}{2}\sum_{t=1}^T\frac{\epsilon_t^2}{\sigma_t^2}\]
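As a sanity check, the closed-form expression on the second line can be compared against a term-by-term sum of normal log-densities via `scipy.stats.norm` (the conditional variances here are arbitrary placeholders, not GARCH output):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
T = 50
sigma2 = 0.5 + rng.random(T)                    # placeholder conditional variances
eps = np.sqrt(sigma2) * rng.standard_normal(T)  # placeholder innovations

# Closed-form log-likelihood from the text
ll_closed = (-T / 2 * np.log(2 * np.pi)
             - 0.5 * np.sum(np.log(sigma2))
             - 0.5 * np.sum(eps**2 / sigma2))

# Equivalent: sum of per-observation normal log-densities
ll_pointwise = np.sum(norm.logpdf(eps, loc=0.0, scale=np.sqrt(sigma2)))
```

The two quantities agree to floating-point precision, confirming the algebra.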

The model has 4 parameters to estimate:

  • μ (mu): the average return
  • ω (omega): the long-run average variance component
  • α (alpha): how much a recent shock affects future volatility
  • β (beta): how persistent volatility is

Unconditional Variance: \[\frac{\omega}{1-\alpha-\beta}\] This represents the long-run average variance to which the process reverts.
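Mean reversion toward this level can be sketched with the standard multi-step forecast identity \(E[\sigma_{t+h}^2] = \bar\sigma^2 + (\alpha+\beta)^h(\sigma_t^2 - \bar\sigma^2)\); the parameter values below are illustrative:

```python
omega, alpha, beta = 0.05, 0.1, 0.85
long_run = omega / (1 - alpha - beta)   # unconditional variance, ~1.0 here

sigma2_t = 2.0                          # start well above the long-run level
s = sigma2_t
path = []
for h in range(1, 51):
    # Under the model, E[epsilon_t^2 | info at t] = sigma_t^2,
    # so the forecast recursion collapses to:
    s = omega + (alpha + beta) * s
    path.append(s)

# Closed form for the h-step-ahead forecast
h = 50
closed = long_run + (alpha + beta)**h * (sigma2_t - long_run)
```

The iterated forecast matches the closed form and decays geometrically toward the long-run variance at rate \(\alpha + \beta\).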

Interactive GARCH Simulation

(Interactive charts from the original page: simulated returns and conditional variance σ² over time.)

Python Implementation with MLE

For a production-ready implementation, see the ARCH library documentation.

Here's the complete Python implementation of the GARCH(1,1) model with MLE estimation from the notebook:

import numpy as np
from scipy.optimize import minimize

def negative_log_likelihood(params, returns):
    """Negative log-likelihood (we minimize this)"""
    mu, omega, alpha, beta = params
    
    # Check constraints
    if omega <= 1e-8 or alpha < 0 or beta < 0 or alpha + beta >= 0.999:
        return 1e10  
    
    n = len(returns)
    epsilon = returns - mu
    
    # Initialize variance
    sigma2 = np.zeros(n)
    sigma2[0] = omega / (1 - alpha - beta)
    
    # Recursive variance calculation
    for t in range(1, n):
        sigma2[t] = omega + alpha * epsilon[t-1]**2 + beta * sigma2[t-1]
    
    # Log-likelihood
    llf = -0.5 * np.sum(np.log(2*np.pi) + np.log(sigma2) + epsilon**2 / sigma2)
    
    return -llf  # Return negative for minimization

def simulate_garch(n, omega=0.05, alpha=0.1, beta=0.85, mu=0.0, seed=None):
    """
    Simulate GARCH(1,1) process
    """
    if seed is not None:
        np.random.seed(seed)
    
    returns = np.zeros(n)
    sigma2 = np.zeros(n)  
    
    # Initial variance (unconditional)
    sigma2[0] = omega / (1 - alpha - beta)
    returns[0] = mu + np.sqrt(sigma2[0]) * np.random.randn()
    
    for t in range(1, n):
        # Update conditional variance
        sigma2[t] = omega + alpha * (returns[t-1] - mu)**2 + beta * sigma2[t-1]
        
        # Generate return
        returns[t] = mu + np.sqrt(sigma2[t]) * np.random.randn()
    
    return returns, sigma2

# Example: simulate a GARCH(1,1) series, then recover the parameters via MLE
n = 2000
returns, _ = simulate_garch(n, omega=0.05, alpha=0.1, beta=0.85, mu=0.0, seed=42)

initial_guess = [0.0, 0.01, 0.1, 0.8]
bounds = [(-0.1, 0.1), (1e-8, 0.1), (0, 1), (0, 1)]

result = minimize(
    negative_log_likelihood,
    initial_guess,
    args=(returns,),
    method='L-BFGS-B',
    bounds=bounds,
    options={'maxiter': 1000, 'ftol': 1e-10}
)

print(f"Estimated parameters: mu={result.x[0]:.4f}, "
      f"omega={result.x[1]:.4f}, alpha={result.x[2]:.4f}, beta={result.x[3]:.4f}")
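Once estimates are in hand, two standard diagnostics are the persistence \(\alpha + \beta\) and the implied half-life of a volatility shock. A minimal sketch, using hypothetical fitted values rather than output from the script above:

```python
import math

# Hypothetical fitted values, for illustration only
mu_hat, omega_hat, alpha_hat, beta_hat = 0.001, 0.04, 0.09, 0.86

persistence = alpha_hat + beta_hat              # how slowly variance forecasts decay
long_run_var = omega_hat / (1 - persistence)    # implied unconditional variance

# Steps needed for a variance deviation to halve: (alpha + beta)^h = 0.5
half_life = math.log(0.5) / math.log(persistence)
```

With persistence of 0.95, shocks to variance take roughly 13–14 periods to decay by half, which is why fitted GARCH models on daily returns typically show α + β close to (but below) 1.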