Fundamentals now available for Quantopian Open, Quantopian paper trading

In December, we introduced a key new feature: fundamental data from Morningstar, available directly within Quantopian. From day one, you could use 600+ metrics covering US companies to screen companies dynamically in your algorithms.

At launch in December, this feature was restricted to backtesting. No live trading (Quantopian paper trading or broker trading) was yet allowed.

Good news — we are beginning to remove these restrictions.

As of today, you can now paper trade algorithms that use fundamental data via get_fundamentals(). Further, the before_trading_start() method is also available for use within paper trading.
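For readers new to the API, a quick illustration of the shape get_fundamentals() hands back: a DataFrame whose columns are securities and whose rows are the queried metrics. The sketch below uses plain pandas with invented tickers and values so it runs outside Quantopian.

```python
import pandas as pd

# A stand-in for the frame get_fundamentals() returns on Quantopian:
# columns are securities, rows are the queried metrics. The tickers and
# numbers here are invented for illustration.
fundamental_df = pd.DataFrame(
    {'AAA': [250e9], 'BBB': [120e9]},
    index=['market_cap'],
)

# Iterating over a DataFrame yields its column labels, so this common
# forum idiom collects the screened securities:
stocks = [stock for stock in fundamental_df]
print(stocks)  # ['AAA', 'BBB']
```

This is why `[stock for stock in fundamental_df]` appears so often in fundamentals algos: it is just a list of the frame's columns.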

And most importantly, by extension, you can enter the Quantopian Open contest with algorithms which use fundamentals.

Some detailed notes on this change:
before_trading_start() is scheduled to run every market day at 8:45 AM ET.

We don’t yet allow you to use fundamentals with broker live trading. We are working on this feature right now and I’ll update this thread when it is ready.

This release has fixed an issue in the data in which get_fundamentals() returned too few companies. Special thanks to Charles Cheng for pointing this out to us.

And most importantly, working on this release revealed a significant bug which we have now fixed. The impact of the bug was that get_fundamentals() could expose fundamental data one day early, resulting in a look-ahead bias in our backtesting. To be clear, this didn’t impact all algorithms or even all algorithms using fundamentals. We’d encourage you to revisit your old backtests using fundamentals if these bugs are a concern and of course I’m happy to answer questions on the topic.

I know lots and lots of you have been itching to use fundamentals in the contest. I’m excited to say that you can now.

Happy investing,




Hi Josh,

Nice feature...thanks!

I've started to tinker around with get_fundamentals() and update_universe() in backtests and I have a few questions:

  • On the help page, it says "fundamental data cannot yet be used with the history function." Is this still the case?
  • I've sorted out that securities can be dynamically added to an algo's universe, but only 200 per day (see my post and others in the thread "Why the 200 sid per day restriction?").
  • The function update_universe() adds securities to the algo's universe, but is there no way to remove them? This could be a problem, since the algo could get bogged down over time if it grabs a different set of 200 sids every day.
  • Presumably, after a given security is added and one bar of data becomes available, the automatic forward filling of bars will start. Correct?


Hi Grant,

Glad to hear you've dug in on the feature. Some answers:

  • The statement on the history function still stands. We're actively working on a project to bring fundamentals into our alpha Research environment. As part of that project, we plan to deliver time series of fundamentals data, and we plan to bring that feature back to the IDE for writing algos as well. In the meantime, you can check out some of the algos Seong has posted in the forums that save data from previous periods. TL;DR: not yet, but we're working on it.
  • Why the 200 sid restriction for update_universe()? Generally for reasons of system performance. At a macro level, performance is an area of investment for us, and we're tracking ourselves publicly here: You'll notice that we've added two fundamentals-focused test cases recently, and Eddie and team are working toward raising these limits for the system as a whole.
  • Correct. This is something you'll need to protect against in your algo for now.
  • I can't think of any reason fundamentals-based universes would behave differently.
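The save-your-own-history workaround mentioned in the first answer can be sketched as follows. This is a plain-pandas illustration (runnable outside Quantopian) with invented sids and market caps; on the platform, each snapshot would come from get_fundamentals() inside before_trading_start().

```python
import collections
import pandas as pd

# Since get_fundamentals() returns only the latest snapshot, save each
# day's values yourself and build a trailing window. A deque bounds the
# amount of history retained.
WINDOW = 5                                    # days of history to keep
snapshots = collections.deque(maxlen=WINDOW)  # oldest days fall off

def record_snapshot(buffer, todays_values):
    """Save today's fundamentals and return the trailing window as a frame."""
    buffer.append(todays_values)
    return pd.DataFrame(list(buffer))         # rows = days, columns = sids

# Two simulated daily market-cap snapshots for two invented securities:
for daily_caps in [{'AAA': 10.0, 'BBB': 5.0},
                   {'AAA': 11.0, 'BBB': 4.5}]:
    trailing = record_snapshot(snapshots, daily_caps)

print(trailing.shape)  # (2, 2): two saved days for two securities
```

With a window like this in place, day-over-day changes in a metric can be computed with ordinary DataFrame arithmetic.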

Hope that helps,

Thanks Josh,

Regarding the history function & get_fundamentals(), I gather that history works in the sense that it will provide historical OHLCV bar data for all sids in the algo universe, but it won't provide a trailing window of historical company fundamental data. Is this correct? Or is it that the dataframe returned by history is static, only providing data for the initial sids specified in the algo, but not handling updates from update_universe()?



We are treating the feature request of "history() for get_fundamentals()" more as an indication of need and identification of an opportunity. We'll likely craft a solution that is separate from the existing history() function.

The likely solution is to make get_fundamentals() capable of returning a pandas panel when a time range is specified.


Thanks Josh,

I continue to be mystified by get_fundamentals(). The code below gives me an error:

Something went wrong. Sorry for the inconvenience. Try using the built-in debugger to analyze your code. If you would like help, send us an email.
KeyError: Security(32043, symbol='QVCB', security_name='LIBERTY MEDIA HOLDING CORP - INTERACTIVE', exchange='NASDAQ GLOBAL SELECT MARKET', start_date=Timestamp('2006-05-10 00:00:00+0000', tz='UTC'), end_date=Timestamp('2015-03-10 00:00:00+0000', tz='UTC'), first_traded=None)
There was a runtime error on line 28.

I'm using minute bar data, with a start date of 3/9/2013 and an end date of 03/09/2015. The error occurs on Line 28:


I'll send a help request in, as well.



def initialize(context):  
    context.master_list = []  
    schedule_function(trade, date_rules.every_day(), time_rules.market_open(minutes=60))  

def before_trading_start(context):  
    # Set up tickers for NASDAQ's largest 200 companies by market cap.  
    fundamental_df = get_fundamentals(  
        query(fundamentals.valuation.market_cap)  
        .filter(fundamentals.company_reference.primary_exchange_id == 'NAS')  
        .filter(fundamentals.valuation.market_cap != None)  
        .order_by(fundamentals.valuation.market_cap.desc())  
        .limit(200)  
    )  
    update_universe(fundamental_df.columns.values)  
    context.stocks = [stock for stock in fundamental_df]  

def handle_data(context, data):  
    pass  

def trade(context, data):  
    # Track every security that has ever passed the screen.  
    for stock in context.stocks:  
        if stock not in context.master_list:  
            context.master_list.append(stock)  
    # Securities that have dropped out of today's screen are exit candidates.  
    for stock in context.master_list:  
        if stock not in context.stocks:  
            pass  

Thanks for digging into fundamentals.

One of the things that tripped me up at first: with fundamentals, you might be sending in orders for stocks from a list maintained separately from the list of stocks with pricing data (i.e., the securities in the data object).

The list of securities from fundamentals (in this case, context.stocks) can include securities that don't have a price at the particular bar where you are attempting to place an order (especially if a security is thinly traded).

So you should protect against this scenario by wrapping your order calls in something like this:

if stock in data:  
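In full, the guard looks something like the sketch below. On Quantopian, `data` supports membership tests for securities with pricing this bar; here a plain set stands in for it, and `order` is a stub, both invented so the example runs anywhere.

```python
# Orders placed this bar; on Quantopian, order() books with the broker sim.
placed = []

def order(stock, amount):
    # Stub standing in for Quantopian's order() for illustration.
    placed.append((stock, amount))

def trade(stocks, data):
    for stock in stocks:
        if stock in data:       # skip names with no pricing data yet
            order(stock, 100)

# 'BBB' has no pricing data this bar, so no order is placed for it:
trade(['AAA', 'BBB', 'CCC'], data={'AAA', 'CCC'})
print(placed)  # [('AAA', 100), ('CCC', 100)]
```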

Hope this makes sense,

Thanks Josh,

I don't understand why the backtester will not accept an order if there is no data for a given security in a bar. I thought that it just won't fill the order if there is no data, but why would it not accept the order? If I use your recommended approach, and place orders only once per day, then orders for thinly traded securities likely won't ever get filled. It seems like something ain't right here. Is it that a security needs to have at least one bar in data before orders will be accepted (so that you can start forward-filling from that bar)? In other words, is my algo getting tripped up because the forward filling hasn't kicked in yet? Is the 'if stock in data' checking for empty bars, or is it just holding off ordering until at least one trade exists, after which there would be data in every bar due to the forward filling?


What are the alternatives?
1) At the minute prior to the start of trading, forward fill from the prior day's close using zero volume.
2) Disconnect the order placement logic from the need to have data present for any ordered security.
3) Do nothing; leave as is.

Outcomes:
1) Orders could be placed at market open but will not be filled until volume shows up on subsequent bars.
2) Orders could be placed at market open but will not be filled until new bars with volume show up.
3) Orders cannot be placed until new bars show up with volume.

Drawbacks:
1) Yesterday's close may not be representative of today's open, and forward-filled data may mislead price-focused metrics.
2) A considerable shift in framework code mechanisms?
3) None.

As the results are identical for each option (without volume, no market model will, or should, allow an order to be filled), which alternative is appropriate here? Market models (and that's what this is) are all best-guess scenarios. They're not reality and never will be. Even if you have the tick stream to play back through a real-time simulation system, you can never simulate time (millisecond delays, network throughput, etc.) or what could have happened had your order been submitted at the open and the impact it may have had (ghost volume). Then again, I probably misinterpreted the topic here and there's an obvious alternative I've skipped.

While you're working on this fundamentals feature, I have a few suggested features/updates to keep in mind:
1. Fields like market_cap and enterprise_value that are affected by the stock price should be automatically recalculated each day from the previous day's close price, or at least there should be a simple way to calculate and filter on them. This is a big drawback today: these fields can be up to a quarter behind.
2. Some fields, such as EBITDA, are quarterly data. It would be useful to have a simple way to query not just the existing fundamental fields but also calculated fields, where a calculated field could be based on some amount of trailing historical fields as well as price data or price history.
3. Allow us to access historical fundamental data as far back as possible. I understand you might have limited daily price data, but Morningstar does have historical fundamentals going much further back, and it can still be useful to research that data.
4. I've been told EOD data going further back is not on the roadmap. But it would be useful if we could bring our own EOD data from outside to work with a backtest. Especially with fundamental analysis, just 15 years could be too small a window to work with. So even if you cannot provide that data, letting us bring our own, at least for backtesting against the hopefully longer Morningstar fundamental history, would be great.
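Suggestion 1 above has a simple do-it-yourself version: if shares outstanding are roughly stable between filings, a stale market cap can be re-scaled each day by the price move since it was reported. The function and figures below are invented for illustration.

```python
def refreshed_market_cap(stale_market_cap, close_at_report, latest_close):
    """Re-scale a stale market cap by the price move since it was computed.

    Assumes shares outstanding haven't changed since the reported figure,
    which is only approximately true between filings.
    """
    implied_shares = stale_market_cap / close_at_report
    return implied_shares * latest_close

# A $100B market cap reported at a $50 close, re-estimated at a $55 close:
print(refreshed_market_cap(100e9, 50.0, 55.0))  # 110000000000.0
```

The same idea works for any price-linked field (enterprise value, P/E, and so on), though corporate actions like splits or buybacks will break the constant-shares assumption.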


(Sub thread market model)

Josh & Market Tech,

The market model for a thinly traded security is this, I think:

  • Once the security is included in the algo's universe (e.g. via update_universe()), the backtester starts looking for trade data.
  • At some point, as handle_data is called repeatedly, a trade pops up in the historical database and is made available to the backtester.
  • Once the first trade is encountered, subsequent bars are forward-filled, including the datetime stamp (see Fawce's Oct 15, 2013 posting).
  • At and beyond the first trade, the backtester will accept orders (regardless of whether the data for the bar are real or forward-filled).
  • Orders are filled only when there is actual trade data for a bar; the backtester skips over forward-filled bars (for which there was zero historical volume).
  • The slippage model is superimposed on the inherent slippage due to orders filling only against actual trade data.

So, the problem, I suspect, is that get_fundamentals() and update_universe() are working properly, but the backtester has a minor shortcoming. Say that security XYZ gets issued and opens for trading, but the first trade doesn't occur until 11 am. If my algo tries to submit an order for XYZ prior to 11 am, then an error results, but historically, if I'd submitted an order prior to 11 am, it would have been accepted by the broker, but perhaps not filled.

Presumably this problem propagates into paper and real-money trading, which is kinda ugly. That would mean security XYZ could open for trading at the bell, but the algo would not be able to submit an order for XYZ until trades start coming in on Q's data feed. In other words, using the Q platform, you'll never be the first one to buy shares in the new issue XYZ!


Hi Sarvi,

  1. When it comes to the market cap not updating daily, the data feed from Morningstar indeed provides those updates. We're not sure why the data isn't updating daily (we get daily and monthly feeds from Morningstar, so our suspicion is that the daily isn't getting used somehow or is being overwritten by the monthly). This is definitely in our queue but we haven't debugged it yet.

  2. Trailing historical time series is an active project we're working on currently. We 100% agree with you.

  3. I'll have to dig into our contract with respect to how much data we are entitled to. We map the data to our internal sids, and we currently only have that data back to 2002. But I understand you might have a use case separate from sids and backtesting -- I expect this is more of a "do research" use case, and maybe we can find a way to expose that data to you efficiently outside the bounds of how we process and format it for the backtester.

  4. I think the research environment is probably what you'd really want here.


(Sub thread market model)

As a developer of market models I would never preclude the ability to submit an order on the existence of market data. Two separate systems. Period. As long as there is brokerage connectivity and a valid security traded through that brokerage [and the market is open for that security] any order should be accepted.

As far as market data availability is concerned, there is always a price. That price might be 1 or even 3 or 4 days old (long weekends) but it is the last known traded price and would still be good until new data arrived.

The combination of these two facts means that even stop and limit orders should be valid, even though the price might be dated. All orders should be valid given the two constraints above. Of course, the market model would be aware of and deal with the lack of simulated size (volume), but order submission should never be blocked.

The alteration of a strategy to bend its order submission timing logic to that of a flawed market model is both an inaccurate representation of the market and a source of potential trading logic errors.

(Sub thread market model)

Hello Market Tech,

I think what I stated above is correct, but I'll have to wait for confirmation from the Quantopian staff, or tinker with it more myself. I suspect that orders will be accepted once the security starts trading, meaning once the first trade ever is posted (the first trade after the security has been issued on the open market). Once the first trade occurs, then the backtester will forward fill, and an order can be submitted any time.


I have been playing around with Quantopian research. Looks great.

I have a question about using my own data. I was able to download historical data (open, close, high, low, adjusted price, and volume) for the S&P 500 index from Yahoo going back to 1950, and I was able to upload it to Quantopian.
local_csv() allows me to load CSV files in general.

But will I be able to use this larger historical dataset to backtest the S&P 500 back to 1950? If so, how do I do this? A sample would be helpful.

Combined with the hoped-for availability of Morningstar fundamental data going back as far as possible, this would be awesome for developing fundamental/value-driven models.
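For the research-side half of this question: local_csv('filename.csv') returns a DataFrame much like pandas.read_csv does, so a Yahoo download can be explored in a notebook even though (per this thread) the backtester can't consume outside data. The sketch below parses a tiny inline Yahoo-style extract; the two rows are invented stand-ins for the real 1950 data.

```python
import io
import pandas as pd

# A miniature Yahoo-format CSV, inlined so the example is self-contained.
csv_text = """Date,Open,High,Low,Close,Adj Close,Volume
1950-01-03,16.66,16.66,16.66,16.66,16.66,1260000
1950-01-04,16.85,16.85,16.85,16.85,16.85,1890000
"""

# In Quantopian research, local_csv('spx.csv') would yield a frame like this.
df = pd.read_csv(io.StringIO(csv_text), parse_dates=['Date'], index_col='Date')
print(len(df), df['Close'].iloc[-1])  # 2 16.85
```

From there, returns, moving averages, and other studies are ordinary pandas operations on `df`.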


Quantopian staff,

When you get the chance, I'd appreciate feedback on my post above:

"(Sub thread market model)
Josh & Market Tech,
The market model for a thinly traded security is this, I think:
--Once the security is included in the algo's universe..."

Is it correct? Did I miss anything?



When you get the chance, I'd appreciate a look at my question above. Is it correct that no trade can be executed against a given security until after the first historical trade shows up in data? --Grant

Grant, I think you nailed it with this particular sentence: "is my algo getting tripped up because the forward filling hasn't kicked in yet? Is the 'if stock in data' checking for empty bars, or is it just holding off ordering until at least one trade exists, after which there would be data in every bar due to the forward filling?"

The forward filling doesn't kick in until a stock has been included in your universe. So if there is no pricing data on the backtest's first bar, forward filling won't start until there is a bar with data.
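The behavior described above can be sketched with pandas: nothing exists before the first real bar, and after it the last trade is carried forward. The price series is invented for illustration.

```python
import pandas as pd

# Five bars for a thinly traded name: no trades until the third bar.
bars = pd.Series([None, None, 10.0, None, 11.0], dtype='float64')

# Forward filling only propagates values that already exist, so the bars
# before the first trade stay empty (NaN) no matter what.
filled = bars.ffill()
print(filled.tolist())  # [nan, nan, 10.0, 10.0, 11.0]
```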

Thanks Josh,

Well, I would consider this a kind of bug: if the backtester knows that a given stock is open for trading, it should accept an order, but not fill it until the first trade (with slippage, if applicable). Presumably this propagates to real-money live trading, right? Which would mean that when a stock first goes on the market, there's no way to be the first to buy it using Quantopian.

Or am I missing something?


Hi Josh,

Just following up. What's the story with initial issues and live trading? Is it correct that one can't be the first to buy the stock using Quantopian's platform? If so, could this problem be fixed?


Any update on when live trading with a brokerage account will support fundamental data?

Are there also plans to add support for fundamentals data to Zipline?

Hi Matthew, good news -- fundamentals has been available for trading on IB for a few months now. Give it a shot.

Constantino, not quite sure what you're asking. The data itself is licensed from Morningstar and as such not ours to open source. The use case we've focused upon is one of using fundamentals for screening stocks for setting a universe. That is outside the scope of zipline, as best I understand it.

Hi Josh,

I mean adding to Zipline the same API as Quantopian algorithms (get_fundamentals(), etc.) and then the possibility of connecting it to your own data source of fundamentals.

Thanks for the clarification. Unfortunately, explicitly bringing fundamentals into zipline is not in our short-term plans. We're doing a lot of work on getting more data into algorithms, and there might be opportunity there to make these kinds of things easier in the long term. We'll keep it in mind.


Hi Josh,

Is the Morningstar moat score (none, narrow, and wide) available in the fundamental database? Thanks.

Doesn't look like it, no.

There are other derived metrics though, documented here:


Hi all - I am trying to get access to historical fundamentals data so my algorithm can evaluate changes in metrics and make trades based on these movements. Has anyone figured out some good workarounds to do this as the history function hasn't been added?


Hi Stefan,
We're working on a feature that will hopefully cover your needs here in the IDE.

The research environment has a version of get_fundamentals() that might suit your purposes there.

For backtesting, some folks build up a history by warming up the algo over the course of a time period, without trading. Here's an algo where that is done: