Trading strategy: Re-weight the components of an ETF

This algorithm attempts to rebuild an ETF using different positive weights. It is a long-only portfolio hedged by shorting the ETF. The idea is that the excess returns (returns on top of the ETF's returns) of each stock can be fed into an optimization problem that maximizes those returns, subject to the constraint that portfolio variance stays below the ETF's variance. I also had to impose a minimum weight per stock, since otherwise some stocks received zero weight.

import numpy as np
import pandas as pd
from sklearn.covariance import OAS
from cvxopt import solvers, matrix

def initialize(context):
    set_symbol_lookup_date('2015-08-01')
    context.XLE = sid(19655)
    context.stocks = symbols('XOM', 'CVX', 'SLB', 'KMI', 'EOG', 'WMB', 'OXY',
                             'VLO', 'PSX', 'COP', 'APC', 'PXD', 'TSO', 'HAL')

    set_benchmark(context.XLE)

    schedule_function(myfunc,
                      date_rule=date_rules.every_day(),
                      time_rule=time_rules.market_close(minutes=30))

def handle_data(context, data):
    record(l=context.account.leverage)

def myfunc(context, data):
    prices = history(200, "1d", "price")

    # Daily log returns of the ETF
    xle = prices[context.XLE].pct_change().dropna().values
    xle = np.log1p(xle)

    prices = prices.dropna(axis=1)
    prices = prices.drop([context.XLE], axis=1)

    # Daily log returns of the component stocks
    ret = prices.pct_change().dropna().values
    ret = np.log1p(ret)

    # Crude per-stock estimate of expected excess return (see mycov below)
    s = []
    for i, stock in enumerate(prices):
        X = xle
        Y = ret[:, i] - xle
        s.append(mycov(X, Y))

    # Shrunk covariance of excess returns; cap portfolio vol at ETF vol
    excess = ret - np.tile(xle, (np.shape(ret)[1], 1)).T
    scores = reweight(s, OAS().fit(excess).covariance_, np.std(xle))
    scores = np.ravel(scores)

    # Scale to a $150k long book and hedge the full notional with the ETF
    wsum = 0
    for i, stock in enumerate(prices):
        val = 150000 * scores[i] / np.sum(np.abs(scores))
        order_target_value(stock, val)
        wsum += val
    order_target_value(context.XLE, -wsum)

def mycov(x, y):
    # EWMA covariance proxy: product of the fast-minus-slow EWMA
    # components of x and y, averaged over the window
    xl = pd.ewma(x, span=10)
    xs = pd.ewma(x, span=5)
    ys = pd.ewma(y, span=5)
    yl = pd.ewma(y, span=10)

    diff = []
    for i in range(len(xs)):
        diff.append((xs[i] - xl[i]) * (ys[i] - yl[i]))
    return np.mean(diff)

def reweight(ret, cov, xlestd):
    # Maximize expected excess return subject to:
    #   ||A w|| <= xlestd   (portfolio vol no greater than ETF vol)
    #   w_i >= 0.05         (minimum weight per stock)
    #   sum(w) = 1
    numPOS = len(ret)
    pbar = ret

    # Factor the covariance so that ||A w|| is the portfolio volatility;
    # eigh is the right choice for a symmetric matrix (real eigenvalues)
    U, V = np.linalg.eigh(cov)
    U[U < 0] = 0
    A = np.dot(np.diag(np.sqrt(U)), V.T)

    # Second-order cone: ||A w|| <= xlestd
    G1temp = np.zeros((A.shape[0] + 1, A.shape[1]))
    G1temp[1:, :] = -A
    h1temp = np.zeros((A.shape[0] + 1, 1))
    h1temp[0] = xlestd

    # One-dimensional cones: w_i >= 0.05
    G2temp = []
    h2temp = []
    for i in np.arange(numPOS):
        ei = np.zeros((1, numPOS))
        ei[0, i] = 1
        G2temp.append(matrix(-ei))
        h2temp.append(matrix([-0.05]))

    # Equality constraint: weights sum to one
    F = matrix(np.ones((1, numPOS)))
    g = matrix(np.ones((1, 1)))

    G = [matrix(G1temp)] + G2temp
    H = [matrix(h1temp)] + h2temp

    solvers.options['show_progress'] = False
    sol = solvers.socp(-matrix(pbar), Gq=G, hq=H, A=F, b=g)
    return np.array(sol['x'])

Interesting, thanks for sharing!

One of the most important ingredients in a mean-variance optimization is the expected returns. I used a crude method to predict the next day's excess returns. Does anyone have ideas on how to forecast next-day or next-week returns that I could supply to the mean-variance optimization?

Pravin,

I have no 'coding skills' (yet) to speak of. But conceptually, I think what you would want to do is write code that does the following:
a) Search for, and prepare, a universe of the underlying stock holdings that meet some core liquidity requirements and are core holdings of a specific sector ETF (this could then be extended to all ETFs).
b) Calculate the trailing X-period look-back 'beta' (or degree of covariance) of these individual stocks to that sector ETF (not to the SPY).
c) Rank these holdings on some known and vetted basic 'stock alpha factors' vs. the index (these will vary significantly based on your time frame, but some methods are discussed below).
d) Select enough of these stocks (say 30) that the basket is likely to mathematically track the underlying return-factor indexes. This is actually very important for the system to work long-term.
e) Weight the stocks based on a mean-variance optimization with constraints. However, we are not really interested in forecasting the returns; we are only interested in forecasting the 'excess returns' above the sector ETF while minimizing variance (and/or drawdown).
f) Short the ETF in an amount that makes the betas of the long and short sides market neutral.
g) (For a more complex but more interesting system, we would create a 'market view' module / function (I still don't know the difference), and then allow the market exposure to vary within some constraints, say from a .2 beta target to a -.2 beta target.)
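Steps (b) and (f) above, estimating each stock's beta to the sector ETF and sizing the ETF short so the combined book is beta-neutral, can be sketched as follows; the data and function names here are illustrative:

```python
import numpy as np

def hedge_ratio(stock_rets, etf_rets, weights):
    """Size the ETF short so the long basket's beta to the ETF nets to zero.

    stock_rets: (T, N) array of stock returns; etf_rets: (T,) ETF returns;
    weights: (N,) long weights summing to one.
    """
    var_etf = np.var(etf_rets, ddof=1)
    # Beta of each stock to the sector ETF (not to SPY), per step (b)
    betas = np.array([np.cov(stock_rets[:, i], etf_rets)[0, 1] / var_etf
                      for i in range(stock_rets.shape[1])])
    # Portfolio beta = fraction of the long notional to short in the ETF
    return weights @ betas

# Toy data: three stocks with true ETF betas 0.8, 1.0, 1.2 plus noise
np.random.seed(1)
etf = np.random.randn(250) * 0.01
stocks = etf[:, None] * np.array([0.8, 1.0, 1.2]) \
    + np.random.randn(250, 3) * 0.005
w = np.array([0.3, 0.4, 0.3])
h = hedge_ratio(stocks, etf, w)   # ETF short, as a fraction of long notional
```

With these toy betas the weighted portfolio beta is close to 1.0, so the hedge shorts roughly the full long notional, much like the `order_target_value(context.XLE, -wsum)` line in the original code.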

As for the 'excess returns': these can be driven by many, many factors. The conceptual approach I would take is to write off-line scripts that run single-factor tests on baskets of 'similar historical' stocks (i.e., stocks in a similar sector and market-cap range, in the same country, with similar 'sector beta') to predict the excess returns of that factor on these 'types of stocks' in isolation.

For example, EBIT/EV has trailing X-period look-back 'excess returns' in this sector. You would probably want to use just the 'recent' factor performance, assuming 'momentum' and persistence in factors / styles.

For initial factors to test:
1. (EBIT/EV)
2. (blended momentum factor - say 30,60 and 90 day momentum as well as 'consistency of' momentum)
Each of the above 'composite factors' would be 'normalized' and weighted at 50%. All stocks would be ranked on their trailing period 'value-mo' score from 1-100.
You would then check how well those 'mo value' scores have predicted 'excess performance' in the past X days (say 30 and 60 days). This module would work really well.
You would use this 'prediction equation' based on the loading on these factors to generate your 'excess return' equation for the optimization.
However, you would likely also put in a 'safeguard' of constraints that the weights had to stay within, as well as...
A constraint that looked at how well the forecasts have been doing of late. If the forecasts have been poor, I would just use an equal-variance weighting method.
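The ranking scheme described above (blend the momentum horizons, normalize each composite factor, weight them 50/50, and rank 1-100) might look like this in pandas; the tickers and factor values are made up:

```python
import pandas as pd

# Hypothetical factor values for a small universe (names illustrative)
df = pd.DataFrame({
    'ebit_ev': [0.08, 0.05, 0.12, 0.03, 0.09],
    'mom_30':  [0.04, -0.02, 0.06, 0.01, 0.03],
    'mom_60':  [0.07, 0.01, 0.10, -0.01, 0.05],
    'mom_90':  [0.10, 0.03, 0.15, 0.02, 0.08],
}, index=['AAA', 'BBB', 'CCC', 'DDD', 'EEE'])

# Blend the momentum horizons into one composite momentum factor
df['momentum'] = df[['mom_30', 'mom_60', 'mom_90']].mean(axis=1)

# 'Normalize' each composite factor (z-score), then weight 50/50
def zscore(s):
    return (s - s.mean()) / s.std()

composite = 0.5 * zscore(df['ebit_ev']) + 0.5 * zscore(df['momentum'])

# Rank on the blended 'value-mo' score, scaled 1-100
df['score'] = composite.rank(pct=True) * 100
```

The `score` column would then be regressed against subsequent 30- or 60-day excess returns to build the 'prediction equation' fed into the optimizer.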

Hope this helps!

There are other factors that will work better than these over very short time periods, but the factors above should work well at various rebalance time frames.

I can't code at all yet, and would be open to a collaboration if you are an expert coder and wanted to build something?

Feel free to email me here or off-line ([email protected]) and we can talk about some.

Hope this helps.
Best,
Tom