Lab 3: Moment-Based Estimation Approach
Introduction
In this lab, we will implement a moment-based approach for parameter estimation. This method relies on matching theoretical moments from our model to empirical moments calculated from the data.
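Formally, if $m(\theta)$ denotes the vector of moments implied by the model at parameter vector $\theta$ and $\hat{m}$ the corresponding vector of empirical moments, the estimator solves

$$\hat{\theta} = \arg\min_{\theta} \; \big(m(\theta) - \hat{m}\big)^{\top} W \big(m(\theta) - \hat{m}\big),$$

where $W$ is a weighting matrix. In this lab we take $W$ to be the identity, so the objective reduces to the sum of squared differences between theoretical and empirical moments.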
Part 1: Constructing the Variance-Covariance Matrix from Data
First, we need to construct the variance-covariance matrix of earnings growth from our empirical data.
import numpy as np
import pandas as pd

def construct_empirical_covariance(data):
    """
    Construct the variance-covariance matrix of earnings growth from the data.

    Parameters:
        data: pandas DataFrame of earnings levels, one column per asset

    Returns:
        numpy array: variance-covariance matrix of earnings growth
    """
    # Calculate period-over-period earnings growth rates
    earnings_growth = data.pct_change().dropna()
    # Compute the covariance matrix; np.cov treats rows as variables,
    # so transpose to put each asset's series on a row
    cov_matrix = np.cov(earnings_growth.T)
    return cov_matrix
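To see the function in action, here is a usage sketch on a small synthetic panel; the asset names, dimensions, and random-walk construction are illustrative only and rely on the imports above.

# Hypothetical usage: a synthetic panel of 3 assets over 100 periods
rng = np.random.default_rng(0)
levels = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0.0, 0.05, size=(100, 3)), axis=0)),
    columns=['asset_1', 'asset_2', 'asset_3'],  # illustrative names
)
empirical_cov = construct_empirical_covariance(levels)
print(empirical_cov.shape)  # (3, 3)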
Part 2: Theoretical Variance-Covariance Matrix from Model
Next, we construct the theoretical variance-covariance matrix implied by our model parameters, using the model's closed-form expressions for the moments. In this lab the model has a simple equicorrelation structure: each asset's earnings growth has variance sigma^2, and each pair of distinct assets has covariance rho * sigma^2.
def construct_theoretical_covariance(parameters):
    """
    Construct the theoretical variance-covariance matrix from model parameters.

    Parameters:
        parameters: dict with keys
            'sigma': standard deviation of each asset's earnings growth
            'rho': common correlation between any two assets
            'n_assets': number of assets

    Returns:
        numpy array: theoretical variance-covariance matrix
    """
    # Extract parameters
    sigma = parameters['sigma']
    rho = parameters['rho']
    n_assets = parameters['n_assets']

    # Construct the theoretical covariance matrix.
    # This equicorrelation structure is a simplified example -
    # adjust based on your specific model.
    cov_matrix = np.zeros((n_assets, n_assets))
    for i in range(n_assets):
        for j in range(n_assets):
            if i == j:
                cov_matrix[i, j] = sigma**2        # variance on the diagonal
            else:
                cov_matrix[i, j] = rho * sigma**2  # covariance off the diagonal
    return cov_matrix
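For intuition, with sigma = 0.2, rho = 0.5, and three assets, the function returns:

params = {'sigma': 0.2, 'rho': 0.5, 'n_assets': 3}
print(construct_theoretical_covariance(params))
# [[0.04 0.02 0.02]
#  [0.02 0.04 0.02]
#  [0.02 0.02 0.04]]

Note that an equicorrelation matrix of this form is positive semidefinite only when rho lies between -1/(n_assets - 1) and 1, which is worth keeping in mind when choosing starting values and bounds for the optimizer.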
Part 3: Objective Function for Optimization
Now we'll create the objective function that will be passed to an optimizer. We'll use automatic differentiation for efficient gradient computation.
import jax
import jax.numpy as jnp
from jax import grad, jit

# Work in double precision so JAX outputs match scipy's float64 expectations
jax.config.update("jax_enable_x64", True)

@jit
def moment_matching_objective(parameters, empirical_moments):
    """
    Objective function for moment matching estimation.
    This function will be optimized using auto-differentiation.

    Parameters:
        parameters: array of model parameters [sigma, rho]
        empirical_moments: empirical variance-covariance matrix (flattened)

    Returns:
        float: objective function value (sum of squared differences)
    """
    # Unpack parameters
    sigma, rho = parameters[0], parameters[1]

    # Recover the matrix dimension; len() sees the static shape, so this
    # is ordinary Python arithmetic even under jit
    n_assets = int(np.sqrt(len(empirical_moments)))

    # Construct theoretical moments
    theoretical_cov = construct_theoretical_covariance_jax(sigma, rho, n_assets)
    theoretical_moments = theoretical_cov.flatten()

    # Calculate the sum of squared differences
    diff = theoretical_moments - empirical_moments
    return jnp.sum(diff**2)
def construct_theoretical_covariance_jax(sigma, rho, n_assets):
    """
    JAX-compatible version of the theoretical covariance construction.
    Avoids in-place element assignment so the result stays differentiable.
    """
    diagonal_elements = sigma**2
    off_diagonal_elements = rho * sigma**2
    # sigma^2 on the diagonal, rho * sigma^2 everywhere else
    cov_matrix = jnp.eye(n_assets) * diagonal_elements + \
        (jnp.ones((n_assets, n_assets)) - jnp.eye(n_assets)) * off_diagonal_elements
    return cov_matrix
# Build the gradient function; by default, grad differentiates with
# respect to the first argument (the parameter vector)
grad_objective = grad(moment_matching_objective)
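Before handing the gradient to an optimizer, it is worth checking it against a finite-difference approximation. Below is a minimal sketch; the test point and step size are arbitrary, and the placeholder moments stand in for a real flattened covariance matrix.

# Sanity check: automatic gradient vs. central finite differences
test_params = jnp.array([0.2, 0.5])
test_moments = jnp.full(9, 0.03)  # placeholder for a flattened 3x3 matrix

g_auto = grad_objective(test_params, test_moments)

eps = 1e-6
g_fd = [(moment_matching_objective(test_params.at[k].add(eps), test_moments)
         - moment_matching_objective(test_params.at[k].add(-eps), test_moments))
        / (2 * eps)
        for k in range(2)]
print(g_auto, g_fd)  # the two should agree to several decimal places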
Part 4: Running the Optimization
Finally, we'll set up and run the optimization:
from scipy.optimize import minimize

def estimate_parameters(data, initial_guess):
    """
    Estimate model parameters using moment matching.

    Parameters:
        data: pandas DataFrame of earnings levels
        initial_guess: initial parameter values [sigma, rho]

    Returns:
        scipy OptimizeResult; the estimated parameters are in result.x
    """
    # Calculate empirical moments
    empirical_cov = construct_empirical_covariance(data)
    empirical_moments = empirical_cov.flatten()

    # Wrap the JAX functions so scipy receives plain Python floats
    # and NumPy arrays
    def objective_wrapper(params):
        return float(moment_matching_objective(params, empirical_moments))

    def gradient_wrapper(params):
        return np.asarray(grad_objective(params, empirical_moments))

    # Run optimization, supplying the JAX gradient as the Jacobian
    result = minimize(
        objective_wrapper,
        initial_guess,
        method='BFGS',
        jac=gradient_wrapper,
        options={'disp': True}
    )
    return result
# Usage example:
# result = estimate_parameters(your_data, initial_guess=[0.1, 0.3])
# print(f"Estimated parameters: sigma={result.x[0]:.4f}, rho={result.x[1]:.4f}")
Key Concepts
- Moment Matching: We match theoretical moments (from our model) to empirical moments (from data)
- Variance-Covariance Matrix: Captures the relationships between different assets' earnings growth
- Automatic Differentiation: JAX provides efficient gradient computation for optimization
- Objective Function: Minimizes the sum of squared differences between theoretical and empirical moments
Exercise
- Load your earnings data and compute the empirical variance-covariance matrix
- Define your model parameters and construct the theoretical moments
- Set up the optimization problem using the provided framework
- Estimate the parameters and analyze the results (the synthetic-data check sketched below is a good first validation)
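A reliable way to validate the full pipeline before using real data is a recovery exercise: simulate earnings from the model with known parameters and confirm the estimator recovers them. Below is a minimal sketch assuming the functions defined above; the true values, seed, and sample size are arbitrary.

# Simulate growth rates from an equicorrelated normal model
true_sigma, true_rho, n_assets, n_periods = 0.05, 0.5, 3, 5000
true_cov = construct_theoretical_covariance(
    {'sigma': true_sigma, 'rho': true_rho, 'n_assets': n_assets})

rng = np.random.default_rng(42)
growth = rng.multivariate_normal(np.zeros(n_assets), true_cov, size=n_periods)

# Convert growth rates to earnings levels, since estimate_parameters
# applies pct_change internally
levels = pd.DataFrame(100 * np.cumprod(1 + growth, axis=0))

result = estimate_parameters(levels, initial_guess=[0.1, 0.3])
print(result.x)  # should be close to [0.05, 0.5]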