Real Time Data: MySQL or Excel?


Discussion in Platforms and Indicators





 

  #1
Fernand0
Argentina
 
Posts: 19 since Apr 2018
Thanks Given: 8
Thanks Received: 3

Hi! Pretty much what the title says.

I want to save data (the DOM) outside NinjaTrader, because of the memory.

Can you tell me the pros and cons of each one?

I'm really inclined toward MySQL, but that's just because I've never used spreadsheets.

Thanks for your time.



  #3
 
xplorer
London UK
Site Moderator
 
Experience: Beginner
Platform: CQG
Broker: S5
Trading: Futures
Frequency: Never
Duration: Never
Posts: 5,925 since Sep 2015
Thanks Given: 15,421
Thanks Received: 15,238



Quoting Fernand0:

Hi! Pretty much what the title says.

I want to save data (the DOM) outside NinjaTrader, because of the memory.

Can you tell me the pros and cons of each one?

I'm really inclined toward MySQL, but that's just because I've never used spreadsheets.

Thanks for your time.

Hi Fernand0,

I don't think I have ever seen a DOM data sample - do you have a sample you want to store that you can share?

It strikes me as something for which MySQL would be better suited - but it depends on the data resolution, your reasons for storing the data, and your longer-term plans for it.

  #4
Fernand0
Argentina
 
Posts: 19 since Apr 2018
Thanks Given: 8
Thanks Received: 3


Quoting xplorer:

Hi Fernand0,

I don't think I have ever seen a DOM data sample - do you have a sample you want to store that you can share?

It strikes me as something for which MySQL would be better suited - but it depends on the data resolution, your reasons for storing the data, and your longer-term plans for it.

Hey, yes...

This structure is not mine (it's from the original post), but it's an example of what I want to do.
  #5
 
xplorer
London UK
Site Moderator
 
Experience: Beginner
Platform: CQG
Broker: S5
Trading: Futures
Frequency: Never
Duration: Never
Posts: 5,925 since Sep 2015
Thanks Given: 15,421
Thanks Received: 15,238


Quoting Fernand0:

Hey, yes...

This structure is not mine, but it's an example of what I want to do.

Given the data resolution, it really depends on how you intend to store the data. If you are going to store it in single-day files, you can use Excel.

But if you intend to use Excel to store multiple days in one single file, chances are that after a few weeks or months of data gathering you will exceed Excel's limit of 1,048,576 rows (at 100,000 rows per day, for example, you would hit that ceiling in about ten trading days), which means MySQL would be more suitable.

Again, it really depends on your intentions, as well as your proficiency in both Excel and SQL.

  #6
Fernand0
Argentina
 
Posts: 19 since Apr 2018
Thanks Given: 8
Thanks Received: 3

One record per change in the DOM... and release the information every so often... let's say every 30 minutes.
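
In code, that buffer-and-flush pattern might look something like the minimal C# sketch below (NinjaTrader scripts are C#). The DomRecord type, its field names, and the flush target are hypothetical, purely to make the idea concrete:

using System;
using System.Collections.Generic;

// Hypothetical record for one DOM change (field names are illustrative).
public struct DomRecord
{
    public DateTime Time;
    public int Level;     // depth level, 0 = top of book
    public bool IsBid;    // true = bid side, false = ask side
    public double Price;
    public long Size;
}

public class DomBuffer
{
    private readonly List<DomRecord> buffer = new List<DomRecord>();
    private readonly TimeSpan flushEvery = TimeSpan.FromMinutes(30);
    private DateTime lastFlush = DateTime.UtcNow;

    // Call once per DOM change ("one record per change").
    public void Add(DomRecord record)
    {
        buffer.Add(record);
        if (DateTime.UtcNow - lastFlush >= flushEvery)
        {
            Flush();
            lastFlush = DateTime.UtcNow;
        }
    }

    private void Flush()
    {
        // Write the buffered rows to MySQL or a file here, then clear
        // the list so NinjaTrader's memory footprint stays flat.
        buffer.Clear();
    }
}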

  #7
 iantg 
Charlotte, NC
 
Experience: Advanced
Platform: My Own System
Broker: Optimus
Trading: Emini (ES, YM, NQ, etc.)
Posts: 408 since Jan 2015
Thanks Given: 90
Thanks Received: 1,147

In SQL you can build views (which are predefined queries that aggregate, calculate, and filter data in any way you can possibly imagine). So the approach you want is to send rows over to your SQL server at whatever frequency you like, and then, when you need them, query the views back into your application. If your application is running on the same machine as your SQL server and you have a decent memory allocation, this process will be the fastest and the cheapest on your memory.
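
A minimal sketch of that round trip from C#, assuming the MySql.Data connector; the table name (dom_updates), view name (vw_dom_summary), and columns are illustrative, not anything specified above:

using System;
using MySql.Data.MySqlClient; // NuGet package: MySql.Data (assumed)

class DomSql
{
    const string ConnStr =
        "Server=localhost;Database=marketdata;Uid=user;Pwd=pass;";

    // Send one DOM change over to the server.
    public static void InsertUpdate(DateTime time, int level, bool isBid,
                                    double price, long size)
    {
        using (var conn = new MySqlConnection(ConnStr))
        using (var cmd = new MySqlCommand(
            "INSERT INTO dom_updates (ts, level, is_bid, price, size) " +
            "VALUES (@ts, @level, @isBid, @price, @size)", conn))
        {
            conn.Open();
            cmd.Parameters.AddWithValue("@ts", time);
            cmd.Parameters.AddWithValue("@level", level);
            cmd.Parameters.AddWithValue("@isBid", isBid);
            cmd.Parameters.AddWithValue("@price", price);
            cmd.Parameters.AddWithValue("@size", size);
            cmd.ExecuteNonQuery();
        }
    }

    // Query an aggregated view back into the application. The view is
    // defined once on the server, e.g.:
    //   CREATE VIEW vw_dom_summary AS
    //     SELECT price, SUM(size) AS total_size
    //     FROM dom_updates GROUP BY price;
    public static void PrintSummary()
    {
        using (var conn = new MySqlConnection(ConnStr))
        using (var cmd = new MySqlCommand(
            "SELECT price, total_size FROM vw_dom_summary", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    Console.WriteLine("{0} -> {1}", reader[0], reader[1]);
        }
    }
}

(Opening a connection per row is shown only to keep the sketch short; in a real feed you would hold one connection open and batch the inserts.)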



I might do a post about this eventually and show how I do it....


Excel, by contrast, holds just over a million rows (1,048,576) and starts to hemorrhage after around 500K rows. It can't calculate, aggregate, or filter quickly at all, and there is no chance you could do any analysis and send it back to your application in any reasonable time.


Hope this helps.

Ian

In the analytical world there is no such thing as art, there is only the science you know and the science you don't know. Characterizing the science you don't know as "art" is a fool's game.
  #8
Fernand0
Argentina
 
Posts: 19 since Apr 2018
Thanks Given: 8
Thanks Received: 3

@iantg

Thanks a lot. But did you do it with spreadsheets, or was that image just to show an example?

  #9
 iantg 
Charlotte, NC
 
Experience: Advanced
Platform: My Own System
Broker: Optimus
Trading: Emini (ES, YM, NQ, etc.)
Posts: 408 since Jan 2015
Thanks Given: 90
Thanks Received: 1,147

Hi Fernand0,

There are two different use cases.

1. To do the initial analysis (not in real time, so there is no need to optimize for latency or execution speed), I just print to the output window of NinjaTrader and copy batches of 100K records at a time into Excel (a minimal sketch of the printing step follows after point 2). The raw data is about 3x the size of the final analyzed dataset, so for one day in the ES you will be around 150K to 200K raw rows. I typically keep a spreadsheet for each day, where I compute my various bets and test different things. In case it's not obvious, I play in the HFT space... so I don't need months and years of history at a macro level to test my bets. I need weeks, or at most a few months, of very detailed information to validate my current betting models. This might be a different approach from the needs of some traders.

2. Now, in a production environment optimized for real-time data feeds and execution speed, you would need a different approach to work with this much data. If the goal is to synthesize 10K to 50K rows of data in your decision engine in real time to get alpha signals to trade off of, then you will certainly need to take the SQL approach. Here the idea would be to do an insert statement for each row into your database table, and then fetch various views in SQL as needed to compute your analytics.
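
As referenced in point 1, here is a minimal sketch of the printing step inside a NinjaTrader 8 indicator or strategy (NinjaScript is C#). This is an illustration built on NT8's OnMarketDepth callback, not iantg's actual code; treat the exact property names as assumptions:

// Inside a NinjaScript indicator or strategy class.
// Prints one comma-separated line per DOM change; copy the output
// window contents into Excel in batches afterwards.
protected override void OnMarketDepth(MarketDepthEventArgs e)
{
    Print(string.Format("{0:HH:mm:ss.fff},{1},{2},{3},{4},{5}",
        e.Time,            // timestamp of the book change
        e.MarketDataType,  // Bid or Ask side
        e.Position,        // depth level (0 = top of book)
        e.Operation,       // Add / Update / Remove
        e.Price,
        e.Volume));
}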

So if you are just getting started and want to do the analysis, then printing to the output window and copying it straight into Excel will work fine during the initial analysis phase... But for real-time trading there is no way to move between Excel and your application fast enough for it to have any value; you will have to go with SQL for that step.

There has been interest in some of these topics from others before, and I have been saying I would get to it for a while now... So maybe I will start a microstructure thread and share how to do some of this data modeling, analysis, betting logic, etc.

Ian



Quoting Fernand0:

@iantg

Thanks a lot. But did you do it with spreadsheets, or was that image just to show an example?


In the analytical world there is no such thing as art, there is only the science you know and the science you don't know. Characterizing the science you don't know as "art" is a fool's game.
  #10
HFF Trader
Minneapolis Minnesota
 
Posts: 14 since Apr 2014
Thanks Given: 8
Thanks Received: 3


If you have any programming experience, I would suggest storing raw data in flat binary files with a fixed record structure (one file per instrument per day). The ES alone has 2-3 million level-1 updates (best bid/ask/trades) per day and more than 4 million level-2 updates (10 levels of order book on each side). It's way too much data even for SQL. But as the other posters said, it depends on whether you want to store the whole thing.
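
A minimal sketch of that flat-file idea in C#, assuming a hypothetical 22-byte record layout and file-naming scheme:

using System;
using System.IO;

class FlatFileWriter
{
    // One file per instrument per day. Fixed-width records mean a
    // reader can seek straight to record N at byte offset N * 22.
    public static void Append(string instrument, DateTime time, byte side,
                              byte level, double price, int size)
    {
        string path = string.Format("{0}_{1:yyyyMMdd}.bin", instrument, time);
        using (var fs = new FileStream(path, FileMode.Append, FileAccess.Write))
        using (var bw = new BinaryWriter(fs))
        {
            bw.Write(time.Ticks); // 8 bytes
            bw.Write(side);       // 1 byte: 0 = bid, 1 = ask
            bw.Write(level);      // 1 byte: depth level
            bw.Write(price);      // 8 bytes
            bw.Write(size);       // 4 bytes -> 22 bytes per record
        }
    }
}

(At the 4+ million level-2 updates per day quoted above, 22 bytes per record works out to roughly 90 MB per instrument per day. In practice you would keep the stream open for the whole session rather than reopening the file per record.)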






Last Updated on August 2, 2018

