
The Arbitrage Pricing Theory model
The CAPM suffers from several limitations, such as the use of a mean-variance framework
and the fact that returns are captured by one risk factor – the market risk factor. In a well-
diversified portfolio, the unsystematic risk of various stocks cancels out and is essentially
eliminated.

The Arbitrage Pricing Theory (APT) model was put forward to address these
shortcomings; it offers a general approach to determining asset prices that relies on
factors other than just means and variances.

The APT model assumes that security returns are generated according to a multiple-factor
model, which consists of a linear combination of several systematic risk factors. Such
factors could be the inflation rate, the GDP growth rate, real interest rates, or dividends.

The equilibrium asset pricing equation according to the APT model is as follows:

E[Ri] = αi + βi,1F1 + βi,2F2 + ... + βi,nFn

Here, E[Ri] is the expected rate of return on the ith security, αi is the expected return on
the ith stock if all factors are negligible, βi,j is the sensitivity of the ith asset to the jth factor,
Fj is the value of the jth factor that influences the return on the ith security, and n is the
number of systematic factors.
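
Before estimating the parameters, it may help to see the pricing equation in action. The following sketch computes E[Ri] for a single asset from hypothetical values of αi, the βi,j sensitivities, and the factor values Fj; all of the numbers are made up purely to illustrate the arithmetic:

In [ ]:
"""
A sketch of the APT pricing equation with hypothetical inputs
"""
import numpy as np

alpha_i = 0.02  # Hypothetical expected return when all factors are negligible
betas_i = np.array([0.8, -0.3, 0.5])  # Hypothetical sensitivities to three factors
factors = np.array([0.01, 0.02, 0.015])  # Hypothetical factor values

# E[Ri] = alpha_i + sum of beta_ij * F_j over all factors
expected_return = alpha_i + betas_i.dot(factors)
print(expected_return)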

Since our goal is to find all values of αi and βi,j, we will perform a multivariate linear
regression on the APT model.
Multivariate linear regression of factor models
Many Python packages, such as SciPy, come with several variants of regression functions.
In particular, the statsmodels package is a complement to SciPy with descriptive statistics
and the estimation of statistical models. The official page for Statsmodels
is https://www.statsmodels.org.

If Statsmodels is not yet installed in your Python environment, run the following command
to do so:

$ pip install -U statsmodels


If you have an existing package installed, the -U switch tells pip to upgrade the selected package to the newest available version.

In this example, we will use the OLS class of the statsmodels.api module to perform an
ordinary least-squares regression and view its summary.

Let's assume that you have implemented an APT model with seven factors that explain the
values of Y. Consider a set of data collected over nine time periods, t1 to t9, where X1 to X7
are the independent variables observed at each period. The regression problem is therefore
structured as follows:

Y = α + β1X1 + β2X2 + ... + β7X7

with one such equation, plus an error term, holding for each of the nine time periods.

A simple ordinary least-squares regression on values of X and Y can be performed with the
following code:

In [ ]:
"""
Least squares regression with statsmodels
"""
import numpy as np
import statsmodels.api as sm

# Generate some sample data
num_periods = 9
all_values = np.array([np.random.random(8)
                       for i in range(num_periods)])

# Filter the data
y_values = all_values[:, 0]  # First column values as Y
x_values = all_values[:, 1:]  # All other values as X
x_values = sm.add_constant(x_values)  # Include the intercept
results = sm.OLS(y_values, x_values).fit()  # Regress and fit the model

Let's view the detailed statistics of the regression:

In [ ]:
print(results.summary())

The OLS regression results will output a pretty long table of statistical information.
However, our interest lies in one particular section that gives us the coefficients of our APT
model:

===================================================================
coef std err t P>|t| [0.025
-------------------------------------------------------------------
const 0.7229 0.330 2.191 0.273 -3.469
x1 0.4195 0.238 1.766 0.328 -2.599
x2 0.4930 0.176 2.807 0.218 -1.739
x3 0.1495 0.102 1.473 0.380 -1.140
x4 -0.1622 0.191 -0.847 0.552 -2.594
x5 -0.6123 0.172 -3.561 0.174 -2.797
x6 -0.2414 0.161 -1.499 0.375 -2.288
x7 -0.5079 0.200 -2.534 0.239 -3.054

The coef column gives us the coefficient values of our regression: the const row is the
estimate of αi, and the x1 to x7 rows are the estimates of the sensitivities βi,1 to βi,7 of our
APT model.
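
Rather than reading these values off the summary table, you can also retrieve them programmatically from the fitted results object; the short sketch below assumes the results variable from the earlier regression is still in scope:

In [ ]:
# Access the fitted coefficients programmatically
print(results.params)   # Intercept (alpha) followed by the factor coefficients (betas)
print(results.pvalues)  # p-values of each coefficient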
