Stock Splits and Warnings

Hey, I was hoping someone could tell me how to safeguard against stock splits.
Also, how do I protect against this type of warning: "WARN Your order for ___ shares of ___ failed to fill by the end of day and was canceled."

Thank you.

15 responses

The best guard against stock-split bugs is timely split-adjusted data.
So join the line of those waiting for a fast, regular, and reliable adjustment procedure.

"WARN Your order for ___ shares of ___ failed to fill by the end of day and was canceled."

is the result of the so-called "realistic" default slippage model, which takes control of the algo upon itself, resulting in unrealistically unbalanced positions (long-short, stock-bond, etc.).

That unreality comes from the wrong assumption that a market order may be fully executed only on bars with positive volume 40 times greater than the order.
In the real world, buy market orders may be executed instantly against the volume of asks in the book, and sell market orders may be executed instantly against the volume of bids in the book.
Limit orders should not have any negative price impact by definition.
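For context, the model being criticized here is Quantopian's default volume-share slippage, which caps each minute's fill at a fraction of that bar's volume; the default 2.5% limit is where the "40 times greater" figure comes from (1 / 0.025 = 40). A minimal sketch of that fill rule, assuming the documented defaults (`volume_limit=0.025`, `price_impact=0.1`) and showing only the buy side:

```python
def volume_share_fill(order_shares, bar_volume, price,
                      volume_limit=0.025, price_impact=0.1):
    """Sketch of a volume-share slippage fill for a buy market order.

    Each bar fills at most volume_limit * bar_volume shares, so a full
    fill needs bar volume 40x the order size at the 2.5% default.
    """
    max_fill = int(bar_volume * volume_limit)
    filled = min(order_shares, max_fill)
    volume_share = filled / bar_volume if bar_volume else 0.0
    # Quadratic price impact, pushed against the buyer
    # (here ~100.00625 for a full 2.5% volume share at price 100).
    fill_price = price * (1 + price_impact * volume_share ** 2)
    return filled, fill_price

# A 1,000-share order against a 10,000-share bar fills only 250 shares.
print(volume_share_fill(1000, 10000, 100.0)[0])  # → 250
```

This is a toy reconstruction for illustration, not the platform's actual implementation.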

Christopher, you may disable the default slippage model by adding a call in initialize:

But that would just be the other extreme of unreality.


The question I have asked the inventor of the 2.5% volume impact several times: where did you get that number?

Limit orders should not have any price impact by definition.

Limit orders do have some price impact, but it's actually the opposite of stop/market orders. To clarify: both drive price in the same direction, but limits act as a floor/ceiling on the trend, while stops/markets act as an accelerator for the current direction. I don't think limit slippage is calculated correctly at all, and I share your view on the 40x volume execution requirement: it's quite silly. In the real world there is much more volume available than what is matched.

Maybe if we get enough people to join the thread and start complaining about how unintelligent the slippage model is, Quantopian will do something about it.

Edit: actually, it's not "unintelligent," it's "crippling." It cripples your ability to write accurate models if you trade every day.

Hey Toan,

I totally get your point about slippage in the context of a prop trading strategy. It takes a very conservative approach to markets that are generally quite liquid. That said, I am opposed to the idea of convening to get slippage altered. I wouldn't want the folks at Q working on something like slippage when there are so many cooler things they could implement.

Have you tried adjusting slippage using the set_slippage() function? I guess my point is that I would rather have Q work on stuff that I can't already adjust myself in one line of code. One thing that might make sense is to look at your fills in live trading and adjust your slippage based on the depth of the actual market you are trading. You wouldn't even need live trading data for this. You could just write some code in your algo that prints out volume on a minute-by-minute basis, and then adjust based on the data you are seeing in the ticker you are trading. Something like this:

def handle_data(context, data):  
    # Log the most recent minute bar's volume for the ticker being traded.  
    volume = data.history(context.spy, 'volume', 1, '1m')  
    print(volume)  

Frank, it's a fair point, and I tend to agree. For me, however, I don't have my own capital to invest, so I was hoping to win the competition and get some notice. My current drawdown due to commission and slippage is about 40% every year ($400k on $1M), and that's hard to surmount.

Also, adjusting the slippage to something more realistic is one line of code (what that line is, I don't know). The current model heavily penalizes active trading.

My bad, Toan. I thought you had mentioned something about actually trading. $400k sure does seem like a lot. Have you broken down those numbers between what accounts for slippage and what is commission? Basically, running four different backtests:

1) With Q default slippage and commission
2) With 0 commission and 0 slippage
3) With default slippage and 0 commission
4) With 0 slippage and default commission

This way you could really isolate where these costs are coming from, and hopefully find a way to reduce those costs without harming your strategy.

Also (this can be debated...), if Q altered the slippage model, wouldn't everybody else see incremental gains in their strategies too? Out of curiosity, does your strategy make use of some security that is not very liquid? Or does it trade very low-priced securities? (Feel free not to share, but I am curious as to why you are getting dinged so hard.)

Frank, I have tried different filters in the pipeline (stocks with liquidity between 20-100 down to liquidity between 85-100, and minimum prices of >$15 and >$30). My number of transactions is somewhere between 300 and 2,000 per day, so that's 7.5% to 50% in transaction costs alone. I'll go and try what you proposed for one year each and see what some of my algos come up with; should be interesting.

Sometimes the slippage model will chop an order into fills of something like 36 shares, 7 shares, and 1 share, and charge $3. That doesn't sound like a lot, but if it does that for 25% of 100 orders, twice a day, 252 days a year, it adds up. Again, I'll run your suggestion and let you know.
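Reading those numbers literally (each chopped order pays the $1 minimum on roughly three fills, so $2 extra per affected order), the "it adds up" claim can be made concrete:

```python
# Back-of-envelope for the chopped-order example above. Assumed reading:
# a split order is charged $3 instead of the intended $1, i.e. $2 extra.
extra_per_order = 3.00 - 1.00            # $3 charged vs. $1 intended
affected_per_day = 0.25 * 100 * 2        # 25% of 100 orders, twice a day
trading_days = 252

annual_extra = extra_per_order * affected_per_day * trading_days
print(annual_extra)  # → 25200.0 (about $25k/year in excess minimums)
```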

Oh, I just realized something: the slippage and commission play into each other. Ultimately it's the commission that's eating away at my numbers, since the slippage model breaks orders into multiple fills.

@Toan

300-2000 transactions a day seems like a lot, especially for a long/short equity fund to be entered in the contest. Today I purchased a tube of toothpaste, an order of chicken strips, a bag of pretzels, and a half-gallon of milk. Four items total, but only two transactions, because three of the items were purchased from the same store. Your 300-2000 transactions per day probably means that you are buying a half-gallon of milk when there is already a half-empty bottle sitting in the fridge.

@Toan The fact that the per-trade commissions are being applied in each minute that a trade is filled is a bug. It should only be applied once per order. Thank you for pointing this out. We're working on a fix.



To clarify, if an order is filled over multiple minute bars, each row in the "Transactions" table doesn't count as a new order (or at least it shouldn't) unless you are explicitly placing a new order with order() or another similar method in each subsequent bar.


I'll post an example. On a $1M portfolio, I did 50 sells and 50 buys: 100 orders total. At 252 days a year, we should spend about $25k in commission, right? Instead, I got the following:

set_commission(commission.PerShare(cost=0.00, min_trade_cost=0)) -3.2%
set_commission(commission.PerShare(cost=0.00, min_trade_cost=1)) -59.4%
set to default: -59.7%

This implies I spent around $550k on commissions. Looking over the transactions, I make about 2,000 transactions per day in this case, and if you assume $1 per transaction, you get about $500k in commission, which is consistent with my backtests.
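The arithmetic behind that consistency check, spelled out:

```python
# Sanity check of the commission estimate quoted above.
transactions_per_day = 2000   # rows in the transaction log, per the backtest
cost_per_transaction = 1.00   # $1 minimum charged on every partial fill
trading_days = 252

annual_commission = transactions_per_day * cost_per_transaction * trading_days
print(annual_commission)              # → 504000.0
print(annual_commission / 1_000_000)  # → 0.504 (≈50% of a $1M portfolio)
```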

[Attached backtest (ID: 573bc36d8a28570f87303151); the interactive results widget (Clone Algorithm, returns/alpha/beta/Sharpe/Sortino/volatility/drawdown tables) is not reproduced here. There was a runtime error.]

Hi Toan,

The distinction I was trying to make was between "orders" and "transactions" (in reference to rows in the "Transaction History" tab in a backtest). Each time you place an order, if it's filled at all (even if only partially), you will incur a commission cost equal to the per trade cost in a PerTrade model. In a PerShare model, you will incur the minimum trade cost only if your partial/complete fill doesn't exceed the minimum cost. This won't necessarily be decided in the first bar after an order is placed as an order can take several bars to fill.

The above description is what the backtester should be doing. It is considered a bug that you are charged the minimum commission cost (or the per trade cost) each minute if a "transaction" occurs as a continuation of a previous order. The fee should only be applied when the order is initially made. We are working on a fix for this. Let me know if this explanation helps.
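The behavior Jamie describes can be sketched as follows. This is a toy model of the bug, not Quantopian's actual implementation, using a zero per-share cost with a $1 minimum as in the backtest above:

```python
def buggy_commission(fills, per_share=0.00, min_trade_cost=1.00):
    # Bug: the per-order minimum is charged on EVERY partial fill.
    return sum(max(shares * per_share, min_trade_cost) for shares in fills)

def intended_commission(fills, per_share=0.00, min_trade_cost=1.00):
    # Intended: per-share cost across all fills, one minimum per order.
    return max(sum(fills) * per_share, min_trade_cost)

fills = [36, 7, 1]  # one order filled across three minute bars
print(buggy_commission(fills))     # → 3.0
print(intended_commission(fills))  # → 1.0
```

This reproduces the "$3 instead of $1" figure mentioned earlier in the thread for an order chopped into 36, 7, and 1 shares.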

Jamie, I think I understood it fine. I'm confused about what in my phrasing led you to think I was confused. The code above was intended to show that I was getting charged $1 for every line in the transaction log.

Hi Toan,

I incorrectly assumed that posting the backtest meant you were looking for more from my answer. Sorry about that! Thank you for demonstrating the issue.

So, it's a bug, right? Not a feature? =)