Long/Short algo simulation error - Out of memory

Dear Experts,

I was simulating a long/short algo using fundamental data, and halfway through it stopped with an out-of-memory error.
I tried using "del" to remove the variables, but it still doesn't work.

I thought that at every rebalancing time, the program would reuse the variables and functions by overwriting them.

Are there other ways to clean up memory at each rebalance period?


5 responses

When I used the following code to delete all variables at the end of the "rebalance" function, I got this error: "Cannot use built-in function 'globals'"

for name in dir():
    if not name.startswith('_'):
        del globals()[name]

Quantopian: are you able to allow access to "globals"?

Any other idea?
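One sandbox-safe alternative (a sketch, not Quantopian's recommended approach: the `Context` class and `rebalance` function below are standalone stand-ins for the platform's context object and rebalance hook) is to keep per-period scratch data as attributes on the context object and delete those attributes before the function returns, instead of touching `globals()`:

```python
class Context:
    """Stand-in for the algorithm's context object."""
    pass

def rebalance(context):
    # Build large per-period scratch data as attributes on context.
    context.scratch_prices = list(range(10000))
    context.scratch_scores = [p * 0.5 for p in context.scratch_prices]

    # ... place orders using the scratch data ...

    # Drop the references before returning so the lists can be
    # garbage-collected instead of accumulating across periods.
    for name in ("scratch_prices", "scratch_scores"):
        if hasattr(context, name):
            delattr(context, name)

ctx = Context()
rebalance(ctx)
print(vars(ctx))  # -> {} : no scratch data retained between periods
```

This only frees objects the algorithm itself holds; it won't help if the memory is being consumed by the platform's own bookkeeping.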

It's probably the transaction logs that are taking up all the memory, not the variables you've stored.

Is there a way to remove the transaction logs? The turnover falls within the Quantopian contest criteria; average holdings are about 500.

One thing that can help a bit is to comment out any pipeline content in 'columns' that isn't used later. Often those columns are only used to arrive at the alpha. During development I view column contents with a pipeline overview, and only re-enable them when I need to actually examine the numbers with my eyes.
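As a rough order-of-magnitude illustration of why unused columns cost memory (this is not Quantopian's actual pipeline internals; the universe size and column count are made-up assumptions), each extra column is another full-length float array per day:

```python
import sys
from array import array

n_stocks = 8000  # assumed size of a daily pipeline universe
# One float64 column for the whole universe on one day.
column = array("d", [0.0] * n_stocks)
per_column_bytes = sys.getsizeof(column)

print(f"one column/day ~ {per_column_bytes / 1024:.0f} KB")
# Ten unused diagnostic columns retained over a 252-day year:
total_mb = 10 * 252 * per_column_bytes / 1024 / 1024
print(f"10 columns x 252 days ~ {total_mb:.0f} MB")
```

The exact figures depend on what the backtester actually retains, but the point stands: columns you don't read later are pure memory overhead.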

How long generally before a backtest goes splat against the wall?

Earlier, in a post, I had speculated that Q keeping all orders for the duration of the backtest might be significant, but I no longer think so: in a 10-year backtest rebalancing 1000 stocks per day, all of those order objects serialized as a string came to only 50 MB on the last day. I think backtesting currently has 8 GB of RAM.
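A quick back-of-envelope check on those numbers (252 trading days per year is my assumption; the 1000 orders/day and 50 MB figures are from the post):

```python
# 10 years of daily rebalancing at 1000 orders/day.
total_orders = 252 * 10 * 1000       # trading days x orders per day
implied = 50e6 / total_orders        # bytes per serialized order

print(f"{total_orders:,} orders, ~{implied:.0f} bytes each")
# -> 2,520,000 orders, ~20 bytes each
```

That works out to roughly 20 bytes per order, which is why the order history alone is unlikely to exhaust an 8 GB limit.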

I think I hit the memory overflow after the 7th year, so I have to break my simulation into runs of at most 6 years.
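Splitting a long backtest into consecutive windows like this can be done with a small helper (a sketch; launching each window as a separate backtest is left to the platform, and the dates below are just examples):

```python
from datetime import date

def windows(start, end, years=6):
    """Split [start, end] into consecutive windows of at most `years` years."""
    out = []
    cur = start
    while cur < end:
        nxt = min(date(cur.year + years, cur.month, cur.day), end)
        out.append((cur, nxt))
        cur = nxt
    return out

for s, e in windows(date(2005, 1, 1), date(2018, 1, 1)):
    print(s, "->", e)
# 2005-01-01 -> 2011-01-01
# 2011-01-01 -> 2017-01-01
# 2017-01-01 -> 2018-01-01
```

Each window then runs as its own backtest, so memory that accumulates over the run is released between windows.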