time of day dependence?

I was just goofing around with this algo, and set the trading time to mid-day. The results were relatively poor. Then I switched to one hour after the open, and got this result. There seems to be a time-of-day dependence, though it warrants more investigation.

Thought I'd share in case someone has insights, or would like to do some testing.

Grant

# Adapted from:
# Li, Bin, and Steven HOI. "On-Line Portfolio Selection with Moving Average Reversion." The 29th International Conference on Machine Learning (ICML2012), 2012.
# http://icml.cc/2012/papers/168.pdf

import numpy as np
from scipy import optimize
import pandas as pd

def initialize(context):

    context.eps = 1.005
    # context.pct_index = 1 # max percentage of inverse ETF
    context.leverage = 1.0

    print 'context.eps = ' + str(context.eps)
    # print 'context.pct_index = ' + str(context.pct_index)
    print 'context.leverage = ' + str(context.leverage)

    set_benchmark(sid(19920)) # QQQ

    # universe: 20 largest NASDAQ stocks by market cap
    fundamental_df = get_fundamentals(
        query(
            fundamentals.valuation.market_cap,
        )
        .filter(fundamentals.company_reference.primary_exchange_id == 'NAS')
        .filter(fundamentals.valuation.market_cap != None)
        .order_by(fundamentals.valuation.market_cap.desc()).limit(20))
    update_universe(fundamental_df.columns.values)
    context.stocks = [stock for stock in fundamental_df]

    # context.stocks.append(symbols('PSQ')) # add inverse ETF to universe

    # note: minute data is not available here, so the universe is
    # screened against data in handle_data() instead

def handle_data(context, data):

    record(leverage = context.account.leverage)

    # check if data exists (iterate over a copy, since we remove in-place)
    for stock in list(context.stocks):
        if stock not in data:
            context.stocks.remove(stock)

    # check for de-listed stocks & leveraged ETFs
    for stock in list(context.stocks):
        if stock.security_end_date < get_datetime():  # de-listed ?
            context.stocks.remove(stock)
        elif stock in security_lists.leveraged_etf_list: # leveraged ETF?
            context.stocks.remove(stock)

    # check for open orders
    if get_open_orders():
        return

    # find average weighted allocation over a range of trailing window lengths
    a = np.zeros(len(context.stocks))
    w = 0
    prices = history(8*390,'1m','price')
    for n in range(1,9):
        (a_n, w_n) = get_allocation(context,data,n,prices.tail(n*390))
        a += w_n*a_n
        w += w_n

    allocation = a/w
    allocation = allocation/np.sum(allocation)

    allocate(context,data,allocation)

def get_allocation(context,data,n,prices):

    # smooth prices with an EMA spanning one trading day (390 minutes)
    prices = pd.ewma(prices,span=390).as_matrix(context.stocks)

    # current portfolio, in dollars per stock
    b_t = []
    for stock in context.stocks:
        b_t.append(context.portfolio.positions[stock].amount*data[stock].price)

    m = len(b_t)
    b_0 = np.ones(m) / m  # equal-weight portfolio
    denom = np.sum(b_t)

    # normalize to weights (fall back to equal weights if unvested)
    if denom == 0.0:
        b_t = np.copy(b_0)
    else:
        b_t = np.divide(b_t,denom)

    # predicted price relatives: mean price / latest price (mean reversion)
    x_tilde = []
    for i, stock in enumerate(context.stocks):
        mean_price = np.mean(prices[:,i])
        x_tilde.append(mean_price/prices[-1,i])

    # long-only bounds, one [0,1] pair per stock
    bnds = []
    limits = [0,1]

    for stock in context.stocks:
        bnds.append(limits)

    # bnds[-1] = [0,context.pct_index] # limit exposure to index

    bnds = tuple(tuple(x) for x in bnds)

    # fully invested, and expected portfolio relative must exceed eps
    cons = ({'type': 'eq', 'fun': lambda x: np.sum(x)-1.0},
            {'type': 'ineq', 'fun': lambda x: np.dot(x,x_tilde) - context.eps})

    res = optimize.minimize(norm_squared, b_0, args=b_t, jac=norm_squared_deriv,
                            method='SLSQP', constraints=cons, bounds=bnds,
                            options={'disp': False, 'maxiter': 100,
                                     'iprint': 1, 'ftol': 1e-6})

    allocation = res.x
    allocation[allocation<0] = 0
    allocation = allocation/np.sum(allocation)

    if res.success and (np.dot(allocation,x_tilde)-context.eps > 0):
        return (allocation,np.dot(allocation,x_tilde))
    else:
        return (b_t,1)

def allocate(context, data, desired_port):

    record(long = sum(desired_port))
    # record(inverse = desired_port[-1])

    for i, stock in enumerate(context.stocks):
        order_target_percent(stock, context.leverage*desired_port[i])

    # close out anything that fell out of the universe
    for stock in data:
        if stock not in context.stocks:
            order_target_percent(stock,0)

def norm_squared(b,*args):
    # 0.5 * ||b - b_t||^2
    b_t = np.asarray(args)
    delta_b = b - b_t
    return 0.5*np.dot(delta_b,delta_b.T)

def norm_squared_deriv(b,*args):
    # gradient of norm_squared with respect to b
    b_t = np.asarray(args)
    delta_b = b - b_t
    return delta_b
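For anyone who wants to poke at the optimization step outside of Quantopian, here's a standalone sketch of the projection that `get_allocation` hands to SLSQP: minimize 0.5*||b - b_t||^2 subject to b.x_tilde >= eps, sum(b) = 1, b >= 0 (the OLMAR update from the Li & Hoi paper). The price relatives below are made up for illustration.

```python
import numpy as np
from scipy import optimize

def olmar_project(b_t, x_tilde, eps):
    """Minimize 0.5*||b - b_t||^2 s.t. b.x_tilde >= eps, sum(b) = 1, b >= 0,
    i.e. the same SLSQP problem get_allocation() sets up above."""
    m = len(b_t)
    cons = ({'type': 'eq',   'fun': lambda b: np.sum(b) - 1.0},
            {'type': 'ineq', 'fun': lambda b: np.dot(b, x_tilde) - eps})
    res = optimize.minimize(lambda b: 0.5 * np.dot(b - b_t, b - b_t),
                            b_t,
                            jac=lambda b: b - b_t,
                            method='SLSQP',
                            bounds=[(0, 1)] * m,
                            constraints=cons,
                            options={'maxiter': 100, 'ftol': 1e-9})
    b = np.clip(res.x, 0, None)
    return b / b.sum()

# Made-up price relatives (mean price / latest price) for four stocks
x_tilde = np.array([1.02, 0.99, 1.01, 1.00])
b_t = np.ones(4) / 4          # start from equal weights
b = olmar_project(b_t, x_tilde, eps=1.01)
print(b)   # tilts toward the stocks with the largest price relatives
```

For this small case the answer can be checked by hand: the equality-constrained solution is b_t + 10*(x_tilde - mean(x_tilde)), i.e. weights of 0.40, 0.10, 0.30, 0.20.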
We have migrated this algorithm to work with a new version of the Quantopian API. The code is different than the original version, but the investment rationale of the algorithm has not changed. We've put everything you need to know here on one page.
8 responses

@Grant,

This short research document may be of interest to you. I ran this many years ago, but was reminded of it by your post. In it, I used proprietary software (from where I used to work) to test many variations of the strategy, which essentially answers the question of whether there is time-differential correlation in the markets. Only the scenarios that stood out are presented.

https://dl.dropboxusercontent.com/u/217878013/Finance/TimeDifferential/index.html

Each image has the description of its particular run.

It's similar to your strategy only in that the time of day is the primary input for trading. I believe I've coded up similar strategies for folks here: time-of-day-dependent algos.

In my experience, some algos have a legitimate time-of-day dependency, some have a data-snooping bias that time-of-day slicing makes worse, and some depend on the time of day purely by chance.

I'm new to Q, but have been looking at some of your postings. I've done algorithmic trading for about 1.5 years, but hadn't encountered this platform before.

I say that by way of introduction: for most stocks, I've also noted a huge volume difference in the first and last 15 or so minutes of the day versus the rest of it. My instinct (never really proven, unlike what you've shown) is that folks take profits off the table in those intervals, particularly if the open gaps up from the prior close. As such, I've waited until 9:45 am to place most trades, and it typically seems to work better than trading at 9:30.
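For what it's worth, that U-shaped volume profile is easy to eyeball with pandas. The snippet below uses synthetic minute volumes as a stand-in for real data; swap in actual minute bars if you have them.

```python
import numpy as np
import pandas as pd

# Synthetic minute volumes for five trading days (390 bars each), with
# extra volume injected into the first and last 15 minutes to mimic
# the U-shape observed in real markets.
rng = np.random.default_rng(0)
base = pd.date_range('2015-01-05 09:31', periods=390, freq='min')
days = []
for d in range(5):
    vol = rng.uniform(800, 1200, size=390)
    vol[:15] *= 3.0     # heavy open
    vol[-15:] *= 2.5    # heavy close
    days.append(pd.Series(vol, index=base + pd.Timedelta(days=d)))
minute_vol = pd.concat(days)

# Average volume for each minute of the trading day, across days
profile = minute_vol.groupby(minute_vol.index.time).mean()
open_avg = profile.iloc[:15].mean()     # 9:31 - 9:45
mid_avg = profile.iloc[180:210].mean()  # around mid-day
print(open_avg / mid_avg)  # roughly 3x with this synthetic data
```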

@Market Tech, may I ask, what is the name of that proprietary software? Is it available on the market, or just used internally at your firm? Thanks.

http://www.4thstory.com/ is the company. They don't do retail.

One possibility is that this is an artifact of Quantopian's slippage model. It is quite punishing if you have the bad luck to place a relatively large order in a low-volume minute. I'd hypothesize that volumes tend to be lower at mid-day than an hour after the open, hence you're getting hit harder by the slippage model at mid-day than near the open.
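To put rough numbers on that: here's back-of-envelope arithmetic for the default VolumeShareSlippage model as I understand it (volume_limit=0.25, price_impact=0.1). The order size, price, and minute volumes below are made up for illustration.

```python
# Default VolumeShareSlippage, as I understand it: an order can fill at
# most 25% of a minute's volume, and the fill price is penalized by
# price_impact * volume_share**2.

def slippage_penalty(order_shares, minute_volume, price,
                     volume_limit=0.25, price_impact=0.1):
    filled = min(order_shares, volume_limit * minute_volume)
    volume_share = filled / minute_volume      # capped at volume_limit
    penalty = price_impact * volume_share ** 2 * price
    return filled, penalty                     # shares filled, $/share cost

# A 5,000-share order at $100 in a quiet mid-day minute vs. a busy one:
# quiet minute fills only 2,000 shares at ~$0.62/share of slippage;
# busy minute fills everything at ~$0.04/share.
print(slippage_penalty(5000, 8000, 100.0))
print(slippage_penalty(5000, 80000, 100.0))
```

The quadratic in volume_share is what makes low-volume minutes so punishing: a 10x drop in minute volume costs roughly 100x more per share, until the volume cap kicks in.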

I am new to the Quantopian family. I am trying to implement a trading algorithm based on Market Tech's study.
However, my code does not seem to be working. I am wondering whether I missed anything. Any comments and revisions would be very appreciated.

'''
Hypothesis:
Price movement during a specific time span of the trading day correlates with price movement of an earlier time span of the same trading day.
Example: a positive price move from 9:30 am to 10:00 am will be followed by a positive move from 12:30 pm to 1:00 pm.

Ref: https://www.quantopian.com/posts/time-of-day-dependence
'''

import numpy as np
import pandas as pd

def initialize(context):
    context.stock = sid(8554)  # SPY
    context.exchange_time = 0
    context.changes = 0
    schedule_function(move_930_1000,
                      date_rules.every_day(),
                      time_rules.market_open(minutes=30))
    schedule_function(react_1230,
                      date_rules.every_day(),
                      time_rules.market_open(hours=3))
    schedule_function(react_1300,
                      date_rules.every_day(),
                      time_rules.market_open(hours=3, minutes=30))

def handle_data(context, data):
    # Convert to EST timezone
    context.exchange_time = pd.Timestamp(get_datetime()).tz_convert('US/Eastern')

def move_930_1000(context, data):
    # runs at 10:00; look back over the first 30 minute bars
    prices = history(30, '1m', 'price')[context.stock]
    # fractional change from the first bar (9:31) to the last (10:00)
    changes = (prices.iloc[-1] - prices.iloc[0]) / prices.iloc[0]
    context.changes = changes
    log.info("Price at 9:30: " + str(prices.iloc[0]))
    log.info("Price at 10:00: " + str(prices.iloc[-1]))
    log.info("Changes: " + str(changes))
    log.info("Exchange Time: " + str(context.exchange_time))

def react_1230(context, data):
    # runs at 12:30; trade in the direction of the morning move
    if context.changes > 0:
        order_target_percent(context.stock, 1)
        log.info("LONG " + str(context.stock))
    elif context.changes < 0:
        order_target_percent(context.stock, -1)
        log.info("SHORT " + str(context.stock))

def react_1300(context, data):
    # runs at 13:00; close the position
    order_target_percent(context.stock, 0)
    log.info(str(pd.Timestamp(get_datetime()).tz_convert('US/Eastern')) + "  REBALANCE " + str(context.stock))


If you flip the trading logic (buy vs. sell short) and still get a P&L curve like this, then you have a friction problem: slippage and commissions are overwhelming any profit that might exist.

You would also want to apply filters: go long only on moves above +0.5%, short only on moves below -0.5%.
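In other words, something like this; the ±0.5% threshold is just the figure suggested above, and the function name is mine:

```python
def signal(change, threshold=0.005):
    """Direction to trade at 12:30, given the 9:30-10:00 fractional move."""
    if change > threshold:
        return 1     # go long
    elif change < -threshold:
        return -1    # go short
    return 0         # stay flat: move too small to overcome friction

print(signal(0.008), signal(-0.002), signal(-0.011))  # 1 0 -1
```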

And nothing says that this mechanism should work...