Directed Acyclic Graph DAG - research strategy

On behalf of the author, the attached research strategy is offered.

In the April 2015 issue of "Stocks & Commodities" magazine (www.traders.com), the article "Basket Trading Using a Directed Acyclic Graph" outlines a means of using a type of network graph called a directed acyclic graph (DAG) to choose a basket of securities for trading a continuously rebalanced portfolio.

The author wishes to thank Quantopian for providing such a useful platform and hopes to offer future articles and technical market explorations on their platform. The attached strategies leverage code and techniques from the open community and the author wishes to thank all members of the Quantopian quantosphere for their willingness to share their insight and hard work.

Quantopian provided an open and capable venue in which to share a Python version of the tradable strategy used to test the groups built by the code outlined in the article. This code is provided here to assist those interested in testing and examining the trading technique. The DAG code itself is not offered here, but were solutions offered to replicate the article's technique, I'm sure all would benefit.

This strategy uses somewhat dated techniques (Security and SecurityManager), as it was written at the end of last summer (the delay is part of the publishing machine).

As the April 2015 issue is, as of now (March 19th), still undergoing distribution, requests for comments related specifically to article content will have to be deferred until the issue has had a chance to be delivered. All comments regarding errors and omissions are welcome.

Clone Algorithm (261 clones)
[Backtest attached; performance metrics widget not captured.]
import math
import numpy
import talib
import collections

trendPeriods = 14
# `ta` is Quantopian's built-in talib transform wrapper (available without import)
rsiIndicator = ta.RSI(timeperiod = trendPeriods)
dollarDelta = 1000
riskReductionFactor = 2.0 # 1 = parity, 2 = lower, .5 = higher
entryStyle = "Value" ### Average, Value, Strength

def initialize(context):
    secMgr = SecurityManager()
    
    # Original DAG group
    # DAG: +.96 and up positively correlated
    secMgr.Add("ACT", sid( 8572))
    secMgr.Add("AIZ", sid( 25955))
    secMgr.Add("AMP", sid( 27676))
    secMgr.Add("ECL", sid( 2427))
    secMgr.Add("ETFC",sid( 15474))
    secMgr.Add("LLL", sid( 18738))
    secMgr.Add("LMT", sid( 12691))
    secMgr.Add("LNC", sid( 4498))
    secMgr.Add("MA",  sid( 32146))
    secMgr.Add("MMC", sid( 4914))
    secMgr.Add("NOC", sid( 5387))
    secMgr.Add("PFG", sid( 23151))
    secMgr.Add("PNR", sid( 6082))
    secMgr.Add("PRU", sid( 23328))
    secMgr.Add("SNA", sid( 6976))
    secMgr.Add("TJX", sid( 7457))
    secMgr.Add("TMK", sid( 7488))
    secMgr.Add("TMO", sid( 7493))
    
    context.SecMgr = secMgr
    context.period = 0
    context.runningPnL = 0.0

    set_commission(commission.PerTrade(cost=1.0))
    set_slippage(TradeAtTheOpenSlippageModel(.1))
    
def handle_data(context, data):
    context.SecMgr.Update(data)    
    context.period += 1
    
    if (context.period < trendPeriods):
        return

    if (abs(context.portfolio.pnl - context.runningPnL) > dollarDelta or len(context.portfolio.positions) == 0):
        context.runningPnL = context.portfolio.pnl        
        for security in context.SecMgr.GetSecurities():
            if (security.Enabled):
                order_target_percent(security.Sid, security.Weight)
            else:
                order_target_percent(security.Sid, 0.0)
        
    #record(PnL=context.portfolio.pnl)
    record(Leverage = context.account.leverage)
    
################################################################                
class SecurityManager(object):
    '''Class to wrap securities'''

    def __init__(this):
        this.stockList = {}
        
    def __str__(this):
        toString = "\tSymbols:{0}\n".format(this.stockList.keys())
        return toString 
    
    def Count(this):
        return len(this.GetSecurities())
    
    def Add(this, symbol, sid):
         this.stockList[symbol] = Security(symbol, sid)

    def Update(this, data):
        rsiValues = rsiIndicator(data)
        maxRsi = numpy.max(rsiValues)
        minRsi = numpy.min(rsiValues)
        avgRsi = numpy.mean(rsiValues)
        if (numpy.isnan(maxRsi) or numpy.isnan(minRsi) or numpy.isnan(avgRsi)):
            return

##### Method #1
#     Rebalances based on extremes of RSI, sell the highest, buy the lowest.
        if (avgRsi > 50):
            trendAdjuster = .05
        else:
            trendAdjuster = -.05

        totalWeight = 0.0
        count = 0.0
        for sec in this.stockList.values():
            if (sec.Sid not in data):
                sec.Weight = 0.0
                sec.Enabled = False
                continue
            sec.UpdatePrices(data)
            if (numpy.isnan(rsiValues[sec.Sid])):
                continue
            if (rsiValues[sec.Sid] > maxRsi - 5):
                sec.SetWeight(-.1 + trendAdjuster)
                totalWeight += abs(-.1 + trendAdjuster)
                count += 1
            elif(rsiValues[sec.Sid] < minRsi + 5):
                sec.SetWeight(.1 + trendAdjuster)
                totalWeight += abs(.1 + trendAdjuster)
                count += 1
            else:
                sec.SetWeight(0)
        if (count == 0):
            return
        
        weightAdjustment = (1.0 - totalWeight) / count
        
        for sec in this.stockList.values():
            if (sec.Weight < 0.0):
                sec.Weight -= weightAdjustment
            elif (sec.Weight > 0.0): 
                sec.Weight += weightAdjustment
#            #print(sec)
#####

##### Method # 2
#
#        # Shift avg RSI UP if trending up to BUY more of the list, 
#        # Shift avg RSI DOWN if trending down to sell more of the list
#        if (avgRsi > 40):
#            avgRsi += 10
#        else:
#            avgRsi -= 10
#        
#        totalWeight = 0.0
#        count = 0.0
#        for sec in this.stockList.values():
#            if (sec.Sid not in data):
#                sec.Weight = 0.0
#                sec.Enabled = False
#                continue
#            sec.UpdatePrices(data)
#            if (numpy.isnan(rsiValues[sec.Sid])):
#                continue
#            
#            if (entryStyle == "Average"):
#                # Set the initial weight to a positive value when below RSI, and a negative value for above RSI
#                weight = abs(minRsi - rsiValues[sec.Sid])
#            elif (entryStyle == "Value"):
#                weight = abs(maxRsi - rsiValues[sec.Sid])
#            elif (entryStyle == "Strength"):
#                weight = abs(minRsi - rsiValues[sec.Sid])
#            
#            totalWeight += abs(weight)
#            sec.SetWeight(weight)
#            count += 1
#
#        if (count == 0):
#            return
#
#        for sec in this.stockList.values():
#            if (sec.Enabled):
#                sec.Weight = sec.Weight / (totalWeight * riskReductionFactor)
#            #print(sec)
#####

#        record(MaxRsi=maxRsi, MinRsi=minRsi, AvgRsi=avgRsi, TotalWeight=totalWeight)
        
    def GetSecurities(this):
        return this.stockList.values()

   
#################################################################
class Security(object):
    '''Class to wrap security'''

    def __init__(this, symbol, sid):
        this.Symbol = symbol
        this.Sid = sid
        this.Weight = 0.0
        this.Enabled = True
        this.Open =  collections.deque(maxlen=trendPeriods) 
        this.High =  collections.deque(maxlen=trendPeriods) 
        this.Low =   collections.deque(maxlen=trendPeriods)  
        this.Close = collections.deque(maxlen=trendPeriods)        
            
    def __str__(this):
        toString = "\tSymbol:{0} weight:{1}\n".format(this.Symbol, this.Weight)
        return toString 
    
    def UpdatePrices(this, data):
        this.Open.append(data[this.Sid].open_price)
        this.High.append(data[this.Sid].high)
        this.Low.append(data[this.Sid].low)
        this.Close.append(data[this.Sid].close_price)
            
    def SetWeight(this, weight):
        this.Weight = weight
        this.Enabled = True
    
########################################################    
class TradeAtTheOpenSlippageModel(slippage.SlippageModel):
    def __init__(this, fractionOfOpenCloseRange):
        this.fractionOfOpenCloseRange = fractionOfOpenCloseRange

    def process_order(this, trade_bar, order):
        openPrice = trade_bar.open_price
        closePrice = trade_bar.price
        ocRange = closePrice - openPrice
        ocRange = ocRange * this.fractionOfOpenCloseRange
        targetExecutionPrice = openPrice + ocRange
            
        # Create the transaction using the new price we've calculated.
        return slippage.create_transaction(
            trade_bar,
            order,
            targetExecutionPrice,
            order.amount
        )
        
'''
Archive

    # Original DAG group
    # DAG: +.96 and up positively correlated
    #secMgr.Add("ACT", sid( 8572))
    #secMgr.Add("AIZ", sid( 25955))
    #secMgr.Add("AMP", sid( 27676))
    #secMgr.Add("ECL", sid( 2427))
    #secMgr.Add("ETFC",sid( 15474))
    #secMgr.Add("LLL", sid( 18738))
    #secMgr.Add("LMT", sid( 12691))
    #secMgr.Add("LNC", sid( 4498))
    #secMgr.Add("MA",  sid( 32146))
    #secMgr.Add("MMC", sid( 4914))
    #secMgr.Add("NOC", sid( 5387))
    #secMgr.Add("PFG", sid( 23151))
    #secMgr.Add("PNR", sid( 6082))
    #secMgr.Add("PRU", sid( 23328))
    #secMgr.Add("SNA", sid( 6976))
    #secMgr.Add("TJX", sid( 7457))
    #secMgr.Add("TMK", sid( 7488))
    #secMgr.Add("TMO", sid( 7493))

    # DAG: .92+ correlated
    #secMgr.Add("AIZ ", sid(25955))
    #secMgr.Add("AZO ", sid(693))  
    #secMgr.Add("FRX ", sid(3014)) 
    #secMgr.Add("GNW ", sid(26323))
    #secMgr.Add("HP  ", sid(3647)) 
    #secMgr.Add("HRS ", sid(3676)) 
    #secMgr.Add("IR  ", sid(4010)) 
    #secMgr.Add("JCI ", sid(4117)) 
    #secMgr.Add("MS  ", sid(17080))
    #secMgr.Add("PBI ", sid(5773)) 
    #secMgr.Add("PNR ", sid(6082)) 
    #secMgr.Add("PRGO", sid(6161)) 
    #secMgr.Add("PX  ", sid(6272)) 
    #secMgr.Add("R   ", sid(6326)) 
    #secMgr.Add("STZ ", sid(24873))
    #secMgr.Add("TJX ", sid(7457)) 
    #secMgr.Add("TMO ", sid(7493)) 
    #secMgr.Add("TYC ", sid(7679)) 
    #secMgr.Add("X   ", sid(8329)) 
    #secMgr.Add("ZMH ", sid(23047))
    
    # WealthFront ETF selection
    #secMgr.Add("VTI", sid(22739))# US Stocks
    #secMgr.Add("VEA", sid(34385))# Foreign Stocks
    #secMgr.Add("VWO", sid(27102))# Emerging Markets
    #secMgr.Add("VNQ", sid(26669))# Real Estate
    #secMgr.Add("DJP", sid(24700))# Natural Resources
    #secMgr.Add("BND", sid(33652))# Bonds
    
    # X type ETFs group
    #secMgr.Add("XLB", sid(19654)) # Materials Select Sector SPDR
    #secMgr.Add("XLE", sid(19655)) # Energy Select Sector SPDR                
    #secMgr.Add("XLF", sid(19656)) # Financial Select Sector SPDR             
    #secMgr.Add("XLI", sid(19657)) # Industrial Select Sector SPDR            
    #secMgr.Add("XLK", sid(19658)) # Technology Select Sector SPDR            
    #secMgr.Add("XLP", sid(19659)) # Consumer Staples Select Sector SPDR      
    #secMgr.Add("XLU", sid(19660)) # Utilities Select Sector SPDR             
    #secMgr.Add("XLV", sid(19661)) # Healthcare Select Sector SPDR            
    #secMgr.Add("XLY", sid(19662)) # Consumer Discretionary Select Sector SPDR
    #secMgr.Add("AGG", sid(25485)) # ISHARES CORE U.S. AGGREGATE BONDS     
    
    # DAG: Uncorrelated
    #secMgr.Add("INTC", sid(3951))
    #secMgr.Add("ITW ", sid(4080))
    #secMgr.Add("LLL ", sid(18738))
    #secMgr.Add("LO  ", sid(36346))
    #secMgr.Add("MA  ", sid(32146))
    #secMgr.Add("MO  ", sid(4954))
    #secMgr.Add("MON ", sid(22140))
    #secMgr.Add("MSFT", sid(5061))
    #secMgr.Add("MSI ", sid(4974))
    #secMgr.Add("OMC ", sid(5651))
    #secMgr.Add("PBI ", sid(5773))
    #secMgr.Add("PX  ", sid(6272))
    #secMgr.Add("ROK ", sid(6536))
    #secMgr.Add("SNA ", sid(6976))
    #secMgr.Add("STZ ", sid(24873))
    #secMgr.Add("TMK ", sid(7488))
    #secMgr.Add("TMO ", sid(7493))
    #secMgr.Add("UNM ", sid(7797))
    #secMgr.Add("VMC ", sid(7998))
    #secMgr.Add("VRSN", sid(18221))
    
    # DAG: -.65 to -.96 negatively correlated
    #secMgr.Add("AMZN", sid(16841) )
    #secMgr.Add("APC ", sid(455))
    #secMgr.Add("AVP ", sid(660))
    #secMgr.Add("CA  ", sid(1209))
    #secMgr.Add("CELG", sid(1406))
    #secMgr.Add("DGX ", sid(16348))
    #secMgr.Add("DNR ", sid(15789))
    #secMgr.Add("DO  ", sid(13635))
    #secMgr.Add("F   ", sid(2673))
    #secMgr.Add("FDO ", sid(2760))
    #secMgr.Add("FE  ", sid(17850))
    #secMgr.Add("HCN ", sid(3488))
    #secMgr.Add("HCP ", sid(3490))
    #secMgr.Add("JBL ", sid(8831))
    #secMgr.Add("JCP ", sid(4118))
    #secMgr.Add("MCD ", sid(4707))
    #secMgr.Add("NEM ", sid(5261))
    #secMgr.Add("NTAP", sid(13905))
    #secMgr.Add("TDC ", sid(34661))
    #secMgr.Add("UTX ", sid(7883))
    
    # Original DAG group
    # DAG: +.96 and up positively correlated
    #secMgr.Add("ACT", sid( 8572))
    #secMgr.Add("AIZ", sid( 25955))
    #secMgr.Add("AMP", sid( 27676))
    #secMgr.Add("ECL", sid( 2427))
    #secMgr.Add("ETFC",sid( 15474))
    #secMgr.Add("LLL", sid( 18738))
    #secMgr.Add("LMT", sid( 12691))
    #secMgr.Add("LNC", sid( 4498))
    #secMgr.Add("MA",  sid( 32146))
    #secMgr.Add("MMC", sid( 4914))
    #secMgr.Add("NOC", sid( 5387))
    #secMgr.Add("PFG", sid( 23151))
    #secMgr.Add("PNR", sid( 6082))
    #secMgr.Add("PRU", sid( 23328))
    #secMgr.Add("SNA", sid( 6976))
    #secMgr.Add("TJX", sid( 7457))
    #secMgr.Add("TMK", sid( 7488))
    #secMgr.Add("TMO", sid( 7493))

    # DAG: .92+ correlated
    #secMgr.Add("AIZ ", sid(25955))
    #secMgr.Add("AZO ", sid(693))  
    #secMgr.Add("FRX ", sid(3014)) 
    #secMgr.Add("GNW ", sid(26323))
    #secMgr.Add("HP  ", sid(3647)) 
    #secMgr.Add("HRS ", sid(3676)) 
    #secMgr.Add("IR  ", sid(4010)) 
    #secMgr.Add("JCI ", sid(4117)) 
    #secMgr.Add("MS  ", sid(17080))
    #secMgr.Add("PBI ", sid(5773)) 
    #secMgr.Add("PNR ", sid(6082)) 
    #secMgr.Add("PRGO", sid(6161)) 
    #secMgr.Add("PX  ", sid(6272)) 
    #secMgr.Add("R   ", sid(6326)) 
    #secMgr.Add("STZ ", sid(24873))
    #secMgr.Add("TJX ", sid(7457)) 
    #secMgr.Add("TMO ", sid(7493)) 
    #secMgr.Add("TYC ", sid(7679)) 
    #secMgr.Add("X   ", sid(8329)) 
    #secMgr.Add("ZMH ", sid(23047))
    
    # Alpha selection from SP250
    #secMgr.Add("FOXA",sid(12213))
    #secMgr.Add("FRX" ,sid(3014))
    #secMgr.Add("GAS" ,sid(3103))
    #secMgr.Add("GCI" ,sid(3128))
    #secMgr.Add("GD"  ,sid(3136))
    #secMgr.Add("GE"  ,sid(3149))
    #secMgr.Add("GIS" ,sid(3214))
    #secMgr.Add("GLW" ,sid(3241))
    #secMgr.Add("GPC" ,sid(3306))
    #secMgr.Add("GPS" ,sid(3321))
    #secMgr.Add("GT"  ,sid(3384))
    #secMgr.Add("GWW" ,sid(3421))
    #secMgr.Add("HAL" ,sid(3443))
    #secMgr.Add("HAS" ,sid(3460))
    #secMgr.Add("HBAN",sid(3472))
    #secMgr.Add("HCN" ,sid(3488))
    #secMgr.Add("HCP" ,sid(3490))
    #secMgr.Add("HD"  ,sid(3496))
    #secMgr.Add("HES" ,sid(216))
    #secMgr.Add("HON" ,sid(25090))
    #secMgr.Add("HP"  ,sid(3647))


'''
30 responses

Hope the release of the article goes well.
This could be one of those opportunities to tackle some of the nitty-gritty intricacies of shorting, and maybe even some code for handling any extra costs of selling borrowed shares, if that's possible, and/or other things involved in shorting. Maybe time frames? I'm not sure how all of that works, like when settlement comes due, etc.

Market Tech,

Thanks, that looks really interesting and I think it'd be great if this would happen much more often as it's quite difficult to try and reimplement a strategy from a paper. In addition, it would be very interesting to also publish a continuously updated forward test of the strategy alongside the paper.


@Thomas W., if there were DAG code in Python, then the dynamic reconstruction of the DAG group from the correlated pairs would undoubtedly be possible, and advisable. Were such a capability to be found or written, it sure would be nice to have a better file structure in which to store often-used code modules. I sure hope that feature is at the top of the list for updates. Unless the research facility is going to offer a capability whereby code modules can be bolted onto the side of any strategy...?

@Market Tech: Do you think that your analysis could be done with the help of the networkx library (https://networkx.github.io/)?

@Thomas W., I see the reference to DAGs in your link (algorithms.dag), and I see some of the familiar verbiage regarding nodes (vertices) and edges. But I'm not sure, on a cursory look, how to retrieve the critical path from the DAG. If you can figure out how to do that, then it looks feasible. Correlation is easy to come by, I would guess. And that's all you need.
The article relied upon the entire S&P 500 as its input set, but the 200 securities available from data[] (or more, if one uses the fundamentals technique Grant discovered) would be adequate. A walk-forward reselection + rebalance test would be educational.
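As a rough illustration of how the critical-path retrieval might work with networkx (a hypothetical sketch with made-up correlation values, not the article's C# code): build a directed graph whose edge weights are the pairwise correlations, then take the heaviest path with `dag_longest_path`.

```python
import networkx as nx

# Toy correlation edges between symbols; directions are assigned arbitrarily,
# only acyclicity matters for the longest-path search.
edges = [("ACT", "AIZ", 0.97), ("AIZ", "AMP", 0.96),
         ("ACT", "AMP", 0.92), ("AMP", "ECL", 0.98)]

G = nx.DiGraph()
G.add_weighted_edges_from(edges)

# The critical (heaviest) path through the DAG is the candidate basket.
basket = nx.dag_longest_path(G, weight="weight")
print(basket)  # ['ACT', 'AIZ', 'AMP', 'ECL']
```

Note that `dag_longest_path` requires the graph to be acyclic, so any cycles would have to be broken when assigning edge directions.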

Doesn't do so well in bear years; try 2007-2009.

Well, testor testor, I don't doubt it. The intent was not to show the efficacy of the strategy, but to demonstrate the technique of using a DAG-generated group. Perhaps, if you have a better rebalancing strategy you'd like to share, the groups referenced within the code offered could be tested to see whether the technique holds up regardless of the strategy used.

@Market Tech,

I'd suggest thinking about using zipline in the research platform, where there is no limit on the number of securities, and you'll have more flexibility.

There is a limit on RAM, but if you run into trouble, I'm sure Fawce would pay for a few more GBs, given that he's using your work to promote Quantopian (https://twitter.com/fawceisfawce/status/578953140973805570).

Grant

@Grant, Thanks for the recommendation. Coding in python is still like knitting with ovenmitts on for me. Everything I've ever presented here or anywhere I wrote in C# first. Slowly though, I'm starting to think in python, which, as we know, is key to becoming adept in a language. The one thing that Quantopian offers, that I've commented on before, that is nearly impossible on any other platform, is this drop-dead-simple portfolio management and testing.

RE: the articles; the February article was essentially ignored (in the TA world, that is; here too, though). The fact that it focused on candlesticks put quants off, I think. But the article was all about how to coerce OHLC numbers into statistically measurable patterns that could be tabulated and used as probability fodder. Regardless, it made the cover image! A python slithering over a candlestick chart. That was all a tribute to Quantopian (in a way).

April's issue garnered the same treatment! (www.traders.com) The cover image is for the Quantopian-referenced article, the DAG (nabbit!). I'd write more but, well, it doesn't pay. You should write for them. Seriously. If enough quants here wrote articles, S&C might make a monthly Quantopian section...

No articles for me. Got plenty to do. Congrats on getting published. --Grant

Just a quick question: if I run the algorithm in minute mode, will the RSI, which is calculated based on the following lines of code, be the RSI over the past 14 minutes?

trendPeriods = 14
rsiIndicator = ta.RSI(timeperiod = trendPeriods)

The reason I am asking is because when I try to run the algorithm in minute mode, I get very poor performance. Even when I use the schedule function and only trade at the end of the day, the performance is really bad and the transactions are very different from those in daily mode...

Is there any way I can run the algo in minute mode (for example, for paper trading) and calculate the RSI parameters in daily mode?

Thanks

@Ali N., I'm going to spend an hour, soon..., and rewrite the strategy so that the methods for rebalancing will run at any periodicity. Note that this strategy is only a research tool to gauge relative group performance, in the hopes of determining the usefulness of a DAG style of security selection.
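In the meantime, for anyone who wants to check the daily RSI numbers independently of the backtest frequency, here is a minimal Wilder-style RSI in pandas (a hypothetical stand-alone helper, not code from the strategy above); it can be applied to daily closes pulled via history even while running in minute mode.

```python
import pandas as pd

def wilder_rsi(close, periods=14):
    """Wilder's RSI computed from a pandas Series of daily closes."""
    delta = close.diff()
    gain = delta.clip(lower=0.0)
    loss = (-delta).clip(lower=0.0)
    # Wilder's smoothing is an exponential moving average with alpha = 1/periods
    avg_gain = gain.ewm(alpha=1.0 / periods, min_periods=periods).mean()
    avg_loss = loss.ewm(alpha=1.0 / periods, min_periods=periods).mean()
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)

# A strictly rising series has no losses, so the RSI pins at 100.
closes = pd.Series([float(p) for p in range(100, 130)])
rsi = wilder_rsi(closes)
print(rsi.iloc[-1])  # 100.0
```

Note that talib's RSI uses the same Wilder smoothing, so values computed this way should track the platform's output closely on the same daily closes.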

I subscribed to S&C mainly off the back of your last article and this looks equally interesting, thanks.

@Market Tech, I totally understand that at this point the strategy is more or less a research topic. I want to do some work on top of what you did. I did a lot of research on DAGs in grad school, and I believe this can potentially be a promising path. However, first I'd love to run it properly and then take further steps. Since it is your code, I think it will be easier for you to make the changes for different-frequency rebalancing. I'll take it from there, and for sure I will share where I end up too.

@Mark L. I'm happy you found the information useful.

Ali N., I'm in the throes of trying to rework this technique such that we achieve the same results it currently exhibits, but in a modular, extensible fashion.

There are many(*) basket trading strategies here. I would hope that perhaps we could find some of those and test the various groups referenced in this strategy using them. Maybe they work, maybe they don't. But it might give you clarity while I get this redone.

  • That's what Quantopian excels at: basket rebalancing strategies.

@Market Tech, sure, sounds good. I have actually been trying to rewrite your code for the past two hours and have made decent progress. If you are working on it, I can pause and work on something else (as you suggested).

However, the issue I have with the names you selected is that you selected them at a specific time, and not necessarily the current time. For testing it won't matter that much, but I would be a bit concerned about in-sample versus out-of-sample calibration. Do you have the DAG graph code in Python?

@Ali N., Feel free to proceed however you believe you would benefit most. My time is limited at this point...

Yes, this specific test was just a snapshot in time. The correlated pairs were pulled from 2010 (the whole year) and then applied forward and backward. But you're right: to use this properly, one would want to recalculate the DAG every so often. Thomas W. above found some potential DAG code in a Python package. I don't know much about it (or whether it's even available here). The C# DAG code is pretty straightforward, but I'm not up to the task of porting it.

Here is a converted version of the DAG research strategy. It generally replicates the original.

A few comments. Commissions are a major issue in this strategy. The Q's exorbitant default of 3 cents per share, while conservative, is unrealistic for this strategy. $2.00 a trade, $4.00 a round trip, is reasonable, I'd wager. So that's what this strategy is set to.

Additionally, I have a sneaking suspicion that some of the returns in this strategy are due to a dividend or split bug (recently discussed elsewhere here).

Regardless, the new structure provides for other portion calculations to be plugged in and tested. A novel way one might treat these is to somehow monitor all the portion calcs for their individual returns, and dynamically choose which version to use from time to time.
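That idea could be sketched roughly like this (a hypothetical `MethodSelector` class, not part of the strategy): track each portion calculator's cumulative return and route the next rebalance to the current leader.

```python
class MethodSelector:
    """Hypothetical sketch: pick among portion calculators by realized return."""

    def __init__(self, methods):
        self.methods = methods                    # name -> calculator callable
        self.cum_return = {name: 0.0 for name in methods}

    def record(self, name, period_return):
        # Naive cumulative tracking; a decayed or windowed sum would also work.
        self.cum_return[name] += period_return

    def best(self):
        # The calculator with the best running return gets the next rebalance.
        return max(self.cum_return, key=self.cum_return.get)

selector = MethodSelector({"portions_a": None, "portions_b": None})
selector.record("portions_a", 0.012)
selector.record("portions_b", 0.034)
print(selector.best())  # portions_b
```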

Clone Algorithm (60 clones)
[Backtest attached; performance metrics widget not captured.]
import math
import numpy
import pandas
import talib
import zipline

RSIPeriods  = 14
DollarDelta = 1000

def initialize(context):
    set_symbol_lookup_date('2015-01-01')
    # Original DAG group, CORR: +.96 and up positively correlated
    #symbols("SPY", "SHY")
    symbols("ACT","AIZ","AMP","ECL","ETFC","LLL","LMT","LNC","MA",
            "MMC","NOC","PFG","PNR","PRU","SNA","TJX","TMK","TMO")
    context.runningPnL = 0.0
    context.S = {}
    
    schedule_function(CalculatePortions_A, time_rule=time_rules.market_open(minutes=10))
    #schedule_function(CalculatePortions_B, time_rule=time_rules.market_open(minutes=10))    
    schedule_function(HandleRebalance,     time_rule=time_rules.market_open(minutes=10))
    
    set_commission(commission.PerTrade(cost=2.0))
    
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~    
def handle_data(context, data):
    record(Leverage = context.account.leverage)
    for stock in data:
        if (stock in context.S):
            continue
        sidData = zipline.protocol.SIDData(stock)
        sidData.Enabled  = False
        sidData.Weight   = 0.0
        context.S[stock] = sidData

#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
def HandleRebalance(context, data):
    if (abs(context.portfolio.pnl - context.runningPnL) < DollarDelta and len(context.portfolio.positions) != 0):
        return
    context.runningPnL = context.portfolio.pnl
        
    for stock in context.S:
        if (stock not in data):
            continue
        if (context.S[stock].Enabled):
            order_target_percent(stock, context.S[stock].Weight)
            print("{0:>5} : {1}".format(stock.symbol, context.S[stock].Weight))
        else:
            order_target_percent(stock, 0.0)
            print("{0:>5} : {1}".format(stock.symbol, context.S[stock].Weight))                 

#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
def CalculatePortions_A(context, data):

    closeDeck = history(RSIPeriods + 1, "1d", "close_price").dropna(axis=1)
    closeDeck = closeDeck[[sid for sid in closeDeck if sid in data]]
    rsiValues = closeDeck.apply(talib.RSI, timeperiod = RSIPeriods)
    rsiValues = rsiValues.dropna()
    if (len(rsiValues) == 0):
        print("No rsi data")
        return
   
    maxRsi = rsiValues.iloc[-1].max()
    minRsi = rsiValues.iloc[-1].min()
    avgRsi = rsiValues.iloc[-1].mean()
    if (numpy.isnan(maxRsi) or numpy.isnan(minRsi) or numpy.isnan(avgRsi)):
       return

    trendAdjuster = .05 if avgRsi > 50 else -.05
    
    count = 0
    totalWeight = 0.0
    for stock in context.S:
        if (stock not in data or stock not in rsiValues):
            context.S[stock].Weight = 0.0
            context.S[stock].Enabled = False
            continue
        context.S[stock].Enabled = True            
        stockRsi = rsiValues[stock].iloc[-1]
        if (stockRsi > maxRsi - 5):
            context.S[stock].Weight = -.1 + trendAdjuster
            totalWeight += abs(-.1 + trendAdjuster)
            count += 1
        elif(stockRsi < minRsi + 5):
            context.S[stock].Weight = .1 + trendAdjuster
            totalWeight += abs(.1 + trendAdjuster)
            count += 1            
        else:
            context.S[stock].Weight = 0.0

    # Guard against division by zero when no stock qualified this pass
    if (count == 0):
        return

    weightAdjustment = (1.0 - totalWeight) / count
    
    for stock in context.S:
        if (context.S[stock].Weight < 0.0):
            context.S[stock].Weight -= weightAdjustment
        elif (context.S[stock].Weight > 0.0): 
            context.S[stock].Weight += weightAdjustment
            
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
def CalculatePortions_B(context, data):
    closeDeck = history(RSIPeriods + 1, "1d", "close_price").dropna(axis=1)
    closeDeck = closeDeck[[sid for sid in closeDeck if sid in data]]
    rsiValues = closeDeck.apply(talib.RSI, timeperiod = RSIPeriods)
    rsiValues = rsiValues.dropna()
    if (len(rsiValues) == 0):
        print("No rsi data")
        return

    maxRsi = rsiValues.iloc[-1].max()
    minRsi = rsiValues.iloc[-1].min()
    avgRsi = rsiValues.iloc[-1].mean()
    if (numpy.isnan(maxRsi) or numpy.isnan(minRsi) or numpy.isnan(avgRsi)):
       return
    
    # Buy more top ranked RSI stocks (strength)
    minRsi = minRsi - 10 if avgRsi < 50 else minRsi + 10
    # Buy more lower ranked RSI stocks (value)
    maxRsi = maxRsi + 10 if avgRsi < 50 else maxRsi - 10
    
    count = 0
    totalWeight = 0.0
    for stock in context.S:
        if (stock not in data or stock not in rsiValues):
            context.S[stock].Weight = 0.0
            context.S[stock].Enabled = False
            continue
        context.S[stock].Enabled = True
        if (avgRsi < 45):
            context.S[stock].Weight = 0.0
        else:
            weight = abs(minRsi - rsiValues[stock].iloc[-1]) # or use maxRsi here
            context.S[stock].Weight = weight
            totalWeight += weight
            count += 1

    if (count == 0):
        return
    
    for stock in context.S:
        if (context.S[stock].Enabled):
            context.S[stock].Weight = context.S[stock].Weight / totalWeight



Thanks a lot. I believe the issue I raised about the minute mode still stands. I think it is more or less due to the minute mode RSI calculation.

@Thomas Wiecki - I think with NetworkX, though there's no "longest path" function (that I've found), you can multiply your weights by -1 and use the bellman_ford function.
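That negation trick can be sketched as follows (hypothetical symbols and weights): since a DAG has no cycles, negative weights are safe for Bellman-Ford, and the minimum-cost path corresponds to the maximum-correlation path.

```python
import networkx as nx

edges = [("A", "B", 0.97), ("B", "C", 0.95), ("A", "C", 0.92), ("C", "D", 0.98)]

# Negate the correlation weights: minimizing total cost now maximizes
# total correlation along the path.
G = nx.DiGraph()
for u, v, w in edges:
    G.add_edge(u, v, weight=-w)

# Bellman-Ford tolerates negative edge weights (Dijkstra would not).
path = nx.bellman_ford_path(G, "A", "D", weight="weight")
print(path)  # ['A', 'B', 'C', 'D']
```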

I just ran this latest algo through minutely data (last 3 months) and it appeared to work without issue. Not well. But it did work. The RSI calc is still on daily data (as it should be).

@ Market Tech.

I see it is working, but the results from minute data and daily data look very different. I assumed that if we run the algo in minute mode and use the schedule function to rebalance, making the trade 2 minutes before the close, then the results of minute mode and daily mode should be very similar. But that is not the case..

Well, one thing one must remember is that when you use history with minutely data, the most recent period pulled is TODAY's price, which for this treatment would not be the price we'd want to calculate the RSI on. So one has to pull one extra day and then trim the last period from it.

closeDeck = history(RSIPeriods + 2, "1d", "close_price").dropna(axis=1)[0 : RSIPeriods + 1]  
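The off-by-one is easy to demonstrate with plain pandas on a synthetic frame; the last row below stands in for today's still-forming bar, so the window is extended by one day and the final row trimmed, mirroring the slice above (the ticker and prices are invented):

```python
import pandas as pd

RSIPeriods = 14
# Synthetic daily closes; the final row plays the role of today's partial bar
closes = pd.DataFrame({"XYZ": range(100, 100 + RSIPeriods + 2)})

# Pull RSIPeriods + 2 rows, then keep only the first RSIPeriods + 1,
# discarding the still-forming last bar, as in the forum snippet
closeDeck = closes.tail(RSIPeriods + 2)[0 : RSIPeriods + 1]

assert len(closeDeck) == RSIPeriods + 1                      # enough completed bars
assert closeDeck["XYZ"].iloc[-1] != closes["XYZ"].iloc[-1]   # today's bar dropped
```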

I'm still researching why the minutely vs the daily diverge so far. (Unfortunately I've seen this many times before and been unable to deduce the discrepancy).

Yeah. I don't understand how it works either. If the difference were slight, it would make sense, but here the difference is crazy if you run your backtest from 2002 till now. I actually added the slippage model for the daily backtest and the return is about 1600%, while in minute mode the return is very small...

How did you determine the direction of an edge between two securities on the DAG? Or is it just an arbitrary direction that doesn't really matter?

Direction should not matter at all; each edge can simply be treated as carrying both directions.

For the article, the edge between vertices was correlation, but any common pairwise measurement would do. As such, there is no direction to the weight during group selection. For trading the group, however, any divergence measurement would work, and such a measurement would have direction, as determined by your strategy: that is, leaders vs. laggards. For a mean-reverting strat, leaders are directed down and laggards up.
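As a sketch of that last point, direction can be assigned from a relative-performance measurement; the tickers, returns, and helper name below are invented for illustration:

```python
# Direct each edge from the leader (stronger recent return) to the
# laggard: in a mean-reverting strategy the leader is faded (directed
# down) and the laggard is bought (directed up).
recent_return = {"AAA": 0.06, "BBB": 0.01}   # hypothetical 1-month returns

def direct_edge(a, b, perf):
    """Return (leader, laggard) so the edge points leader -> laggard."""
    return (a, b) if perf[a] >= perf[b] else (b, a)

leader, laggard = direct_edge("AAA", "BBB", recent_return)
print(leader, laggard)  # AAA BBB: short the leader, long the laggard
```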

Also, remember this is a DAG so no cycles are allowed in constructing the group's longest path (critical path).
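Kahn's algorithm is one way to enforce that: it yields a topological order when the graph is acyclic and fails when a cycle sneaks in, and the order it produces is exactly what a critical-path computation needs. A minimal sketch (the edge list is invented):

```python
from collections import defaultdict, deque

def topo_order(nodes, edges):
    """Kahn's algorithm: a topological order, or None if a cycle exists."""
    indeg = {n: 0 for n in nodes}
    succ = defaultdict(list)
    for u, v, _ in edges:
        succ[u].append(v)
        indeg[v] += 1
    queue = deque(n for n in nodes if indeg[n] == 0)  # start at the sources
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return order if len(order) == len(nodes) else None  # None => cycle found

edges = [("A", "B", 0.9), ("B", "C", 0.8), ("A", "C", 0.5)]
assert topo_order({"A", "B", "C"}, edges) is not None                   # acyclic
assert topo_order({"A", "B"}, [("A", "B", 1), ("B", "A", 1)]) is None   # cycle
```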

Four years ago, the article by Dave Cline, "Basket Trading Using A Directed Acyclic Graph," in the April 2015 issue of Stocks & Commodities magazine, brought me to Quantopian.

Despite the fact that "Directed Acyclic Graph DAG - research strategy" by Market Tech has nothing to do with the above article, I ported it to Q2, simplified and modified for my own (not Quantopian) strategy requirements:
Significantly outperform the market with reasonable beta (< 0.25), drawdown (< 0.25), and Sharpe ratio (> 1.0) over a full market cycle.

# Directed Acyclic Graph DAG simplified and modified  
# https://www.quantopian.com/posts/directed-acyclic-graph-dag-research-strategy#5c5e9e421f902a0046a56511  
import numpy  
import talib  
# --------------------------------------------------------------  
assets, RSIPeriods, lev, h = symbols('QQQ', 'TLT'), 14, 1.0, 0.2  
# --------------------------------------------------------------  
def initialize(context):  
    schedule_function(trade, date_rules.week_start(), time_rules.market_open(minutes = 65))  

def trade(context, data):  
    if get_open_orders(): return

    prices = data.history(assets, 'price', RSIPeriods + 1, '1d')  
    rsiValues = prices.apply(talib.RSI, timeperiod = RSIPeriods)  
    rsiValues = rsiValues.dropna()  
    if (len(rsiValues) == 0): return

    maxRsi = rsiValues.iloc[-1].max()  
    minRsi = rsiValues.iloc[-1].min()  
    avgRsi = rsiValues.iloc[-1].mean()

    if (numpy.isnan(maxRsi) or numpy.isnan(minRsi) or numpy.isnan(avgRsi)): return  
    raw_wt = 1.0 / len(assets)  
    trendAdjuster = .05 if avgRsi > 50 else -.05

    weight = {}; totalWeight = 0.0; wt = {}  

    for sec in assets:  
        if (not data.can_trade(sec) or sec not in rsiValues):  
            wt[sec] = 0.0  
            continue  # no tradable data for this security, skip it  
        stockRsi = rsiValues[sec].iloc[-1]  
        if (stockRsi > maxRsi - 5):  
            weight[sec] = raw_wt*(1.0 - h) + trendAdjuster  
            totalWeight += abs(raw_wt*(1.0 - h) + trendAdjuster)  
        elif(stockRsi < minRsi + 5):  
            weight[sec] = raw_wt*h + trendAdjuster  
            totalWeight += abs(raw_wt*h + trendAdjuster)  
        else: return 

    for sec in assets:  
        if data.can_trade(sec):  
            wt[sec] = weight.get(sec, 0.0) / totalWeight if totalWeight > 0 else 0  
            order_target_percent(sec, wt[sec])  
            record(**{sec.symbol: wt[sec]}) 

    record(leverage = context.account.leverage)  
'''
1000000  
START  
06/01/2007  
END  
02/07/2019

assets, RSIPeriods, lev, h = symbols('QQQ', 'TLT'), 14, 1.0, 0.2  
schedule_function(trade, date_rules.week_start(), time_rules.market_open(minutes = 65)) 

Total Returns  
320.07%  
Benchmark Returns  
122.9%  
Alpha  
0.11  
Beta  
0.18  
Sharpe  
1.13  
Sortino  
1.66  
Volatility  
0.11  
Max Drawdown  
-14.45%

'''

Hey Vladimir,
Nice to see this strategy and concept revived.
Four years? Ouch! My, how time does scream by. The DAG was a crazy idea I dreamed up as I learned about the Facebook graph. I'd been deep into pairs and basket trading and wanted to see if they could be combined. The DAG was the answer. Now, I see your code trades only a pair of ETFs. The DAG expects one to trade a basket. But perhaps your code portrays the re-balancing of two of the basket components. Half of the DAG concept is actually creating the basket itself - that is, the network of "friends" of securities. Regardless, thanks for the flashback. (Oh, and Dave Cline and I may be the same person.)

Hi, Market Tech.

Glad to hear from you.
I agree: The DAG expects one to trade a basket of "friends".
But I found that your strategy, especially with my modification, works better on a pair of "enemies".