Welcome to NexusFi: a trading community with over 150,000 members.
The database includes data on the following instruments:
— CME currency futures: 6A, 6B, 6C, 6E, 6S, 6J
— equity indices: ES, NQ, YM, NKD, TK
— energy: CL
— metals: GC, SI, PL, HG
— grains: ZC, ZS, ZW, ZL
— bonds: ZN, ZB
— spread instruments, for example ZWH4-ZWK4
The data cover December 2013 through mid-2014. There are gaps on some instruments, but for research purposes this is not critical. The files include every change to the limit orders in the DOM (Level 2) and every market order.
The data format is as follows:
1) the name of the archive corresponds to the ticker name
2) the archive contains a folder with the ticker name
3) the folder contains *.txt files, where each file name corresponds to a specific date (DD-MM-YYYY)
4) each file contains records in a fixed format; for example, here are a few records from the instrument CL (crude oil):
"A;17:57:22;11280;10090;10;6;1;"
A — a change to the limit orders on the ask side;
17:57:22 — time of the event;
11280 — microseconds;
10090 — the price at which the event occurred;
10 — current size of the limit orders at that price level;
6 and 1 — internal datafeed flags
"B;17:57:22;12749;10087;19;16;1;"
B — a change to the limit orders on the bid side;
17:57:22 — time of the event;
12749 — microseconds;
10087 — the price at which the event occurred;
19 — current size of the limit orders at that price level;
16 and 1 — internal datafeed flags
"T;17:57:24;9046;10087;1;S;"
T — a trade;
17:57:24 — time of the event;
9046 — microseconds;
10087 — the price at which the trade occurred;
1 — the volume of the trade;
S — the side of the aggressor; in this case it was a sell
As you can see, all values in the file are separated by semicolons, so these files can easily be loaded into an Excel spreadsheet by treating them as *.csv.
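The record layout above can also be parsed directly; here is a minimal sketch in Python (the field names are my own labels for the fields described above, not names from any official spec):

```python
from dataclasses import dataclass

@dataclass
class Record:
    kind: str    # 'A' = ask update, 'B' = bid update, 'T' = trade
    time: str    # HH:MM:SS
    micros: int  # microsecond part of the timestamp
    price: int   # price as stored in the file
    size: int    # limit size at the level, or trade volume
    extra: list  # datafeed flags (A/B records) or aggressor side (T records)

def parse_line(line: str) -> Record:
    # Fields are semicolon-separated with a trailing semicolon,
    # e.g. "A;17:57:22;11280;10090;10;6;1;"
    fields = line.strip().rstrip(';').split(';')
    kind, time, micros, price, size, *extra = fields
    return Record(kind, time, int(micros), int(price), int(size), extra)

r = parse_line("T;17:57:24;9046;10087;1;S;")
print(r.kind, r.price, r.size, r.extra)  # → T 10087 1 ['S']
```

Looping `parse_line` over a day file gives the full event stream for that date.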
I keep getting requests asking people to share their GomRecorder data. A few threads have been started on the subject, but not many are following through, it would seem.
I am collecting data from IQFeed via QCollector. All data contains bid/ask and is tick …
No, it's not a duplicate. The data I collected are much more detailed: they include all changes in the DOM (not only the best bid/ask but the full Level 2) and all trades, in the order in which they occurred on the exchange. Additionally, my data include microseconds and the number of orders at each limit level. For example:
"A;17:57:22;11280;10090;10;6;1;"
A — a change to the limit orders on the ask side;
17:57:22 — time of the event;
11280 — microseconds;
10090 — the price at which the event occurred;
10 — current size of all limit orders at that price level;
6 — current number of limit orders at that price level;
1 — internal flag indicating whether this is a new limit order or an old limit order moved from one price level to another
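Because each A/B record carries the *current* total size at its price level (per the field descriptions above), reconstructing the book is just overwriting levels as records arrive. A rough sketch; the convention that size 0 removes a level is my assumption, not stated in the thread:

```python
# Rebuild a Level-2 book by replaying A/B records in file order.
# Each record carries the current size at the level, so we
# overwrite rather than accumulate.
def replay(records):
    book = {'A': {}, 'B': {}}  # side -> {price: size}
    for line in records:
        fields = line.strip().rstrip(';').split(';')
        side, price, size = fields[0], int(fields[3]), int(fields[4])
        if side in ('A', 'B'):
            if size == 0:
                book[side].pop(price, None)  # assumed: zero size clears the level
            else:
                book[side][price] = size
    return book

book = replay([
    "A;17:57:22;11280;10090;10;6;1;",
    "B;17:57:22;12749;10087;19;16;1;",
    "A;17:57:23;500;10090;0;0;1;",  # hypothetical record clearing the ask level
])
print(book)  # ask level 10090 removed; bid 10087 holds size 19
```

Trade (T) records are skipped here; they would be matched against the book separately.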
I took a look at the data. Pretty good work, @kirillxskynet; a shame that no one else really appreciates it. To add:
(1) Your file format is better than what's on the other thread. There's some weird data in the files on the other thread: the 5th column is just a running sum of the 4th column, which you can regenerate in one line of Perl from filename.txt (your stored data), plus a few more lines if you want to reset the counter every time it hits 6 PM. It's a waste of space to store it.
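The one-line command referenced above did not survive in this copy of the post. For illustration only, a Python equivalent of the described operation, appending a running sum of the 4th column as an extra column; the semicolon delimiter and trailing semicolon are assumptions carried over from the format described earlier:

```python
import sys

def add_running_sum(lines):
    # Append a running total of the 4th column as an extra trailing
    # column, reproducing the redundant 5th column described above.
    total = 0
    for line in lines:
        fields = line.strip().rstrip(';').split(';')
        total += int(fields[3])
        yield ';'.join(fields + [str(total)]) + ';'

if __name__ == '__main__':
    # usage: python running_sum.py < filename.txt
    for out in add_running_sum(sys.stdin):
        print(out)
```

Resetting the counter at 6 PM would just mean zeroing `total` whenever the time field crosses 18:00:00.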
(2) There's a lot more data in @kirillxskynet's files. I was too lazy to find the 2014 files on the collector thread, but there are about 3 million rows in @kirillxskynet's 6AU4 May files, compared to 1k~900k rows in the latest 6A.