I’m glad you are finding some success :)
I myself have also made some headway. For the sake of documenting this, as well as getting any input from the community, here's where I'm at.
I wanted a way to keep track of today's trades, and to also see yesterday's.
Problem is, the order table uses a virtual DOM and only loads a couple of orders at a time.
I've developed an extension which pretty much tricks the virtual DOM into showing all the data. After deciphering their codebase I was able to locate a few global functions that give me access to the entire web application. With this I am able to load all the trades I've done. Their developers store most of their constructors in memory, which makes accessing script instances extremely hard. I'm considering creating service workers to proxy the application's scripts back onto itself, allowing me to essentially rewrite their frontend and take full control of Quantopian's paper trading platform.

I would like to move to direct websocket communication, which is pretty simple, but I need to replicate Q's sid-to-stock-name translation. The JSON comes back without the stock symbol; there's another function that translates it which I need to gain access to.

I must admit, that table has some terrible performance. Extending the number of rows and running all their updates on that table is painful. Once the table is loaded, I use mutation observers on the DOM to spot changes, then patch them into my persisted object, which goes back to RH for synchronization.
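For anyone curious, the observe-and-patch step looks roughly like this. The table selector and `rowToOrder` parser are placeholders for whatever Q's actual DOM uses, not their real names; only the merge logic is concrete:

```javascript
// Pure merge: patch changed orders into the persisted object by id,
// keeping any fields the change didn't touch.
function patchOrders(persisted, changedOrders) {
  const next = { ...persisted };
  for (const order of changedOrders) {
    next[order.id] = { ...next[order.id], ...order };
  }
  return next;
}

// Browser-only wiring; guarded so the pure part runs anywhere.
if (typeof MutationObserver !== 'undefined') {
  let persisted = {};
  const table = document.querySelector('.order-table'); // assumed selector
  const observer = new MutationObserver((mutations) => {
    const changed = mutations
      .map((m) => m.target.closest('tr'))
      .filter(Boolean)
      .map(rowToOrder); // hypothetical row-to-order parser
    persisted = patchOrders(persisted, changed);
  });
  observer.observe(table, { childList: true, subtree: true, characterData: true });
}
```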
I've noticed a couple of forms hidden inside, along with some iframes. I'll need more time to investigate, but with access to their models, along with watching the chatter that happens between frames, I may be able to create a more robust API on both ends.
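If the frames talk via postMessage (still unconfirmed), eavesdropping on the chatter is as simple as:

```javascript
// Format a cross-frame message for logging; pure so it's testable.
function describeFrameMessage(origin, data) {
  return `[frame ${origin}] ${JSON.stringify(data)}`;
}

// Browser-only: log every message any frame sends to this window.
if (typeof window !== 'undefined') {
  window.addEventListener('message', (e) => {
    console.log(describeFrameMessage(e.origin, e.data));
  });
}
```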
I iterate over the orders in the table and construct an object with a key that's hashed uniquely from the order data. Now that I have the trades, I run a server which intercepts the object.
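The keying could be done with any stable hash over the order fields; here's a minimal sketch using FNV-1a, where the field names (symbol, side, qty, price, createdAt) are my assumptions about what Q exposes:

```javascript
// FNV-1a over a stable string form of the order; same order data
// always yields the same key.
function hashOrder(order) {
  const s = [order.symbol, order.side, order.qty, order.price, order.createdAt].join('|');
  let h = 0x811c9dc5;
  for (let i = 0; i < s.length; i++) {
    h ^= s.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16);
}

// Build the keyed object the server will intercept.
function buildOrderMap(orders) {
  const map = {};
  for (const o of orders) map[hashOrder(o)] = o;
  return map;
}
```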
On the server side (which might just be another service worker in Chrome) I make a call to Robinhood's API and get my trade history there. I essentially synchronize the trades and make sure that Robinhood is placing the same orders as Q. If there's a new trade in the order book that's not in RH, I push a new order to the API.
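The sync itself is just a set difference: given the keyed Q order map and the keys already mirrored at RH, figure out what still needs placing. A sketch (the actual RH API call is omitted):

```javascript
// Return the Q orders whose keys are not yet present on the RH side.
function ordersToPlace(qOrders, rhOrderKeys) {
  const seen = new Set(rhOrderKeys);
  return Object.entries(qOrders)
    .filter(([key]) => !seen.has(key))
    .map(([, order]) => order);
}
```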
I have noticed that some Robinhood trades are executed but don't appear on Q. I'm looking at adding an exclusion list in case I want to manually trade something. And if Q places a limit order that is later cancelled because it wasn't filled, but it did fill in RH, I track that position independently until it's profitable, then sell it programmatically.
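The orphan-tracking rule boils down to: skip anything on the exclusion list, and flag a position for sale once the quote clears its cost basis. A sketch, where the field names are assumptions:

```javascript
// Given orphaned positions (filled in RH, cancelled on Q), current
// quotes, and a manual exclusion list, pick the ones to sell.
function ordersToSell(orphans, quotes, excluded = new Set()) {
  return orphans.filter(
    (p) => !excluded.has(p.symbol) && quotes[p.symbol] > p.costBasis
  );
}
```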
Once I get enough understanding, I should be able to actually pass data back into the Python algorithm and make modifications there, such as if I deposit more money or buy a stock that I want to track with the algo. I doubt they will allow cross-origin requests, but it should be fairly simple to transcode their authentication into some headers. I've done this before to issue local sessions from a remote host, with great success.
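By "transcode their authentication" I mean capturing the browser session and replaying it as headers from the server. The header and cookie names below are placeholders, not Q's real ones:

```javascript
// Turn a captured browser session into request headers a remote
// host can replay. All names here are placeholder assumptions.
function buildAuthHeaders(session) {
  return {
    'Authorization': `Bearer ${session.token}`, // placeholder scheme
    'Cookie': `sessionid=${session.cookie}`,    // placeholder cookie name
    'X-CSRF-Token': session.csrf,               // placeholder CSRF header
  };
}
```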
One of your Q algos places many limit orders, most of which are cancelled. I want to minimize the number of queries I make to RH and Q, to avoid suspicious activity, so I'm considering some persistent layer that can query the endpoints less frequently. Firebase might be an option just because I can asynchronously write and read elsewhere; changing the database would indirectly execute an RH command.
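Whatever the backing store ends up being, the throttling layer would look something like this (simplified to a synchronous fetch for clarity; a real one would be async and persist the cache):

```javascript
// Wrap an endpoint fetcher so it hits the endpoint at most once per
// minIntervalMs; everything in between is served from the cache.
function makeThrottledFetcher(fetchFn, minIntervalMs) {
  let last = -Infinity;
  let cached = null;
  return function get(now = Date.now()) {
    if (now - last >= minIntervalMs) {
      cached = fetchFn();
      last = now;
    }
    return cached;
  };
}
```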
Digging further into how Q runs, I discovered some websockets and a whole bunch of code. Some of it may be beneficial to us, but I won't mention it publicly. Luckily, the compilers used to build the web app are pretty bad; there's plenty of obfuscated code, but the manner in which it's mangled makes it easy to reverse.
I'm not a fan of this 15-minute delay. Your algo does place limit orders though, so in some cases they appear before being filled, letting me forward them to RH with no profit loss. However, I wonder if it's possible to use Robinhood as the data source? I can get the endpoint and historical data, which would give us up-to-date info. I could even mess with the dates so that it appears 15 minutes behind. I also might have found the golden goose, but I need to pick the data apart.
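Messing with the dates is trivial: shift every RH timestamp back 15 minutes so the live feed masquerades as the delayed one Q expects. A sketch:

```javascript
// Re-date a live RH timestamp so it appears 15 minutes behind,
// matching Q's delayed feed.
const DELAY_MS = 15 * 60 * 1000;

function delayTimestamp(isoString) {
  return new Date(new Date(isoString).getTime() - DELAY_MS).toISOString();
}
```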