Is it possible to downsample fundamental data?

I would like to generate a factor for equities based on fundamental data. The problem is that the calculation uses current values minus values from five years ago. When I try to write a CustomFactor to do this with a window length of 1260 days I get a MemoryError, even when I mask out all but the US500 equities. About 500 days is the largest window that works, and even that is very slow.
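For rough scale, here is a back-of-envelope memory estimate. The ~8,000-asset universe size and float64 storage are assumptions, not confirmed numbers from Quantopian:

```python
# Back-of-envelope memory for one pipeline input window.
# Assumed numbers: ~8,000 assets in the unmasked universe, float64 values.
days, assets, bytes_per_value = 1260, 8000, 8
window_bytes = days * assets * bytes_per_value
print(window_bytes / 1e6)  # roughly 80.6 MB per fundamental input
```

Several such windows (one per fundamental field), plus intermediate copies during computation, could plausibly exhaust the memory budget even before masking is applied, since masking may happen after the full window is loaded.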

The fundamental data does not change that often; I would be happy with four data points a year. Is it possible to downsample the fundamental data somehow so that I can get five years' worth of data?
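To illustrate the idea, here is a minimal NumPy sketch (not Quantopian's API): daily fundamental values are mostly forward-filled repeats of the last report, so keeping one row per ~63 trading days (roughly one quarter) recovers the underlying quarterly series at a fraction of the memory:

```python
import numpy as np

# Hypothetical data: 20 quarterly reports for 4 assets, forward-filled
# to 1260 trading days (~5 years) the way daily fundamentals behave.
rng = np.random.default_rng(0)
quarterly_reports = rng.normal(100.0, 10.0, size=(20, 4))
daily = np.repeat(quarterly_reports, 63, axis=0)   # shape (1260, 4)

# Downsample: keep every 63rd trading day instead of all 1260 rows.
sampled = daily[::63]                              # shape (20, 4)
print(sampled.shape)
```

Because the daily rows are just repeats, the 20 sampled rows carry the same information as the full 1260-row window.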

Thanks for the help


Not sure, I would also be interested in the answer. I saw another post a while back that used a different method to get data from past quarters. I've also noticed that trying to go back 5 years causes crashes, and requesting the history of more than three fundamental metrics was also causing crashes. Quantopian is developing a better way to access fundamental data from past quarters, but I haven't seen a progress update so far.

@Karl Z, thanks for the reply and for sharing that notebook. I think it can be done in Research.

It would be great if we could access Pipeline in the IDE/backtester the way it is accessed in Research (without the safeguard against look-ahead bias, of course).
For example: if I want the one-year increase in some value, Pipeline has to give me 252 data points when I only need the first and the last.
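A small NumPy sketch of that point (hypothetical data, not Quantopian's API): a change-over-window factor only ever touches the endpoints, so only 2 of the 252 rows actually matter:

```python
import numpy as np

# Hypothetical 252-day window of some fundamental value for 3 assets.
rng = np.random.default_rng(1)
window = rng.normal(50.0, 5.0, size=(252, 3))

# One-year change computed from the full window...
full_change = window[-1] - window[0]

# ...and from just the two endpoint rows.
endpoints = window[[0, -1]]                       # shape (2, 3)
change_from_endpoints = endpoints[1] - endpoints[0]
print(np.allclose(change_from_endpoints, full_change))  # True
```

The other 250 rows are carried around in memory but never used by the computation.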

I have noticed that Pipeline is really slow with multiple fundamental signals. The code for appending fundamental data has not been released, but it would not surprise me if it were doing some sort of slow lookup for each asset.