Using OnMarketData() on Historical data with a recording engine - NinjaTrader Programming | futures io social day trading


Views / Replies: 28,313 / 115
Created by: gomi. Attachments: 23




  #91 (permalink)
Site Administrator
Manta, Ecuador
 
Futures Experience: Advanced
Platform: My own custom solution
Favorite Futures: E-mini ES S&P 500
 
Posts: 46,238 since Jun 2009
Thanks: 29,350 given, 83,220 received


Originally posted by danjurgens:
I've started working on the DB back end for gomRecorderIndicator. It got a little more complicated when I figured out that bringing up the DB had to be serialized; that added a lot of complexity if one wanted to keep it as an NT indicator, so I'm taking a slightly different route. I'm in the process of creating it as a standalone server app. I've got the server side up and running, although not really debugged at all. I'm working on the client side now.

The downside is I probably won't have anything I can distribute until next weekend.

Once I do that I'll start a new thread instead of continuing to hijack this one!

Looking forward to your new thread. Sooner rather than later, I hope, because I already have questions.

I've been trying to get a mysql backend for recording tick data for a long time, this may be exactly that. But I wanted to make it publicly accessible (futures.io (formerly BMT) Elite only). So maybe we can work together to make that happen, the general idea was a central tick repository that others can pull from.

Mike

Due to time constraints, please do not PM me if your question can be resolved or answered on the forum.

Need help?
1) Stop changing things. No new indicators, charts, or methods. Be consistent with what is in front of you first.
2) Start a journal and post to it daily with the trades you made to show your strengths and weaknesses.
3) Set goals for yourself to reach daily. Make them about how you trade, not how much money you make.
4) Accept responsibility for your actions. Stop looking elsewhere to explain away poor performance.
5) Where to start as a trader? Watch this webinar and read this thread for hundreds of questions and answers.
6) Help using the forum? Watch this video to learn general tips on using the site.

If you want to support our community, become an Elite Member.

 
  #92 (permalink)
Elite Member
Austin, TX
 
Futures Experience: Advanced
Platform: TradeStation, TWS
Broker/Data: TradeStation, IB .. for now
Favorite Futures: 6E, ES
 
Posts: 28 since May 2010
Thanks: 0 given, 27 received

What I have so far is a long way from something that I'd ever expose to the internet. Robustness has taken a back seat to getting something working fairly quickly, but the underpinnings are probably there. Design problems I can foresee for something that serves data to a lot of clients: 1. It spawns a new thread for each client, so it might not scale well beyond a hundred or so concurrent clients. 2. The overly simple protocol has no way to deal with faulty requests.

I opted against a relational database and am using Berkeley DB (which is what MySQL used to use for its internal storage). BDB stores data as key/data pairs; this works great for ticks, with a key of (symbol, time) and data of (price, bid, ask, volume). Duplicate keys are allowed. For simplicity of implementation I just have one database right now, but it would be easy to change to numerous internal DBs per symbol, or per symbol and a subset of time like year/month, and that would be completely transparent to the client.
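The (symbol, time) → (price, bid, ask, volume) layout with duplicate keys can be sketched in a few lines. This is only a minimal in-memory emulation for illustration, not the actual implementation: BDB itself would store the packed bytes and walk duplicate-key groups with a cursor, and every name here is invented.

```python
import struct
from collections import defaultdict

def make_key(symbol: str, ts: int) -> bytes:
    # Fixed-width symbol + big-endian timestamp, so byte order equals sort order.
    return symbol.encode().ljust(8, b"\x00") + struct.pack(">Q", ts)

def make_data(price: float, bid: float, ask: float, volume: int) -> bytes:
    return struct.pack(">dddQ", price, bid, ask, volume)

db = defaultdict(list)  # key -> list of packed records (duplicate keys allowed)

def write_tick(symbol, ts, price, bid, ask, volume):
    db[make_key(symbol, ts)].append(make_data(price, bid, ask, volume))

def read_ticks(symbol, ts):
    # Analogue of iterating a BDB cursor over one duplicate-key group.
    return [struct.unpack(">dddQ", d) for d in db[make_key(symbol, ts)]]

write_tick("ES", 1291380000, 1220.25, 1220.00, 1220.25, 3)
write_tick("ES", 1291380000, 1220.50, 1220.25, 1220.50, 1)  # same key, second record
print(len(read_ticks("ES", 1291380000)))  # 2
```

Packing the timestamp big-endian is what makes a plain byte-sorted B-tree return ticks in time order, which is why BDB-style stores tend to use that key shape.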

The socket interface is very simple: three requests, read, real-time write, and backfill. There is only one response from the server, containing data from a read request (1..N per request). Real-time and backfill writes are the same, except backfill first deletes all the data for that exact symbol and timestamp. Read requests take a symbol and a start timestamp; the server then streams all the data after that timestamp for the given symbol, in blocks that each contain all the ticks for one exact timestamp.

This design pushes some buffering responsibility onto a backfill writer, because it has to send all the ticks for a unique timestamp at once or it deletes its previously sent data. Readers get N responses, each with all the data for a given timestamp.
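A hedged sketch of what framing for those three request types might look like. The actual protocol was never published, so the opcodes and field layout below are invented assumptions, purely to illustrate the shape of a fixed-header-plus-payload design.

```python
import struct

# Invented opcodes for the three requests described in the post.
READ, RT_WRITE, BF_WRITE = 1, 2, 3

def encode_request(op: int, symbol: str, ts: int, payload: bytes = b"") -> bytes:
    # [1-byte opcode][8-byte symbol][8-byte timestamp][4-byte payload length][payload]
    return struct.pack(">B8sQI", op, symbol.encode(), ts, len(payload)) + payload

def decode_request(buf: bytes):
    op, sym, ts, n = struct.unpack(">B8sQI", buf[:21])
    return op, sym.rstrip(b"\x00").decode(), ts, buf[21:21 + n]

msg = encode_request(BF_WRITE, "ES", 1291380000, b"tick-block")
print(decode_request(msg))  # (3, 'ES', 1291380000, b'tick-block')
```

With a fixed 21-byte header, a server thread can read the header first, learn the payload length, and then read exactly that many more bytes, which keeps the parser trivial, at the cost (noted above) of having no error-reporting path for faulty requests.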

Right now I'm working on shoehorning a client into a GomFileManager subclass so I can do some debugging and testing of the server. It takes a bit of kludging, because the recordTick interface to that class only provides the price and tick type, so I have to rebuild fake ask and bid prices. I also need to understand the read-back use case a little better, and that might lead to changes in the read request/response interface.

 
  #93 (permalink)
Elite Member
CA
 
Futures Experience: Master
Platform: Marketdelta and Ninja
Broker/Data: Velocity
Favorite Futures: NQ
 
Posts: 670 since Apr 2010
Thanks: 64 given, 521 received


I truly think ninja should come in here and kiss the feet of gomi and other programmers that are literally fixing their buggy crap software for them for free.

 
  #94 (permalink)
Elite Member
Austin, TX
 
Futures Experience: Advanced
Platform: TradeStation, TWS
Broker/Data: TradeStation, IB .. for now
Favorite Futures: 6E, ES
 
Posts: 28 since May 2010
Thanks: 0 given, 27 received

It is quite amazing, all the hoops we have to jump through because they won't add 2 columns to a table!

 
  #95 (permalink)
Elite Member
San Francisco Bay Area
 
Futures Experience: Intermediate
Platform: NT,TOS,IB
Favorite Futures: ES,CL,TF
 
Posts: 278 since Jun 2010
Thanks: 154 given, 267 received


Originally posted by danjurgens:
It is quite amazing, all the hoops we have to jump through because they won't add 2 columns to a table!

It is worse than that.

They store the bid/ask/last tick data, but store it separately, so the sequence of ticks is not preserved. I presume that is because NT can draw the charts with either of these values, and someone made the architectural decision to tie the DB to the application!

They could very easily fix this so that a single tick DB has all three types of ticks with proper sequential timestamps, using a very simple filter, but they chose not to. So the tick data downloaded by NT is not useful for backfill.

HOWEVER, they also have the infrastructure for replay, which DOES store and capture tick data (both L1 and L2) in the right sequence. But that infrastructure can be used only for replay and cannot drive your indicators.

I presume they are simply overloaded and understaffed, and different teams did their own thing without regard to the big picture. NT is the only software I know of which maintains two separate, mutually incompatible DBs for tick data. And they still cannot offer chart backfill capability (OnMarketData) for historical values.
--------------------------------------------------------------------------------------------------

On your project: the biggest challenge I have is that you cannot update your database while you are collecting ticks. So you have to find a time window when ticks are not coming in to fill the gaps in your gomi database.

There is a similar issue with your DB too. How will you guarantee that you do not create duplicate entries when you update the DB with historical data for the times your tick collector was not working? One way to approach this would be to define sectors (say, 5 minutes of tick data) and use that as the minimum unit which can be updated. So when you are reloading historical data, your application should send data in chunks of 5 minutes (or multiples of 5 minutes), and the DB will treat that data as the golden copy and overwrite any existing data for those 5 minutes. The live tick data, on the other hand, will not check for duplicates and will simply insert.
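The sector idea above can be sketched as follows. This is only an in-memory illustration of the proposed overwrite rule; CHUNK, chunk_of, live_insert, and backfill are all invented names, not part of any actual recorder.

```python
CHUNK = 300  # sector size in seconds (the suggested 5-minute minimum update unit)

def chunk_of(ts: int) -> int:
    return ts - ts % CHUNK

store = {}  # chunk start -> list of (ts, price, volume)

def live_insert(ts, price, volume):
    # Live path: blind append, no duplicate checking.
    store.setdefault(chunk_of(ts), []).append((ts, price, volume))

def backfill(chunk_start, ticks):
    # Historical path: the chunk is the minimum update unit; the incoming
    # data is the golden copy and overwrites whatever is already there.
    assert chunk_start % CHUNK == 0, "backfill must be chunk-aligned"
    assert all(chunk_start <= t < chunk_start + CHUNK for t, _, _ in ticks)
    store[chunk_start] = list(ticks)

live_insert(1000, 1220.25, 2)                          # lands in chunk 900
backfill(900, [(950, 1220.00, 1), (1000, 1220.25, 2)])  # replaces the live copy
print(len(store[900]))  # 2, not 3: no duplicate of the live tick survives
```

The point of the chunk-alignment assertion is that a partial backfill could otherwise silently drop live ticks from the rest of the sector it overwrites.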

 
  #96 (permalink)
Elite Member
San Francisco Bay Area
 
Futures Experience: Intermediate
Platform: NT,TOS,IB
Favorite Futures: ES,CL,TF
 
Posts: 278 since Jun 2010
Thanks: 154 given, 267 received


Originally posted by gomi:
Thanks.

Not sure this is supported by Ninja: I tried on the ES and I don't get session break bars, and Bars.SessionBreak never becomes true.

If you want all ticks you can always set "disable tick filter" to true.

OK. Will try that. At least for the charts I am interested in, the complete session is needed.

 
  #97 (permalink)
Elite Member
Austin, TX
 
Futures Experience: Advanced
Platform: TradeStation, TWS
Broker/Data: TradeStation, IB .. for now
Favorite Futures: 6E, ES
 
Posts: 28 since May 2010
Thanks: 0 given, 27 received

There will be no issue writing backfill and real-time data to the DB at any time. All updates and reads from the DB will be by unique timestamp (seconds from the market, although I see gomi has an ms option). Berkeley DB supports transactions; readers will have a consistent view of the data on a per-timestamp basis, as they'll lock each block while reading it to prevent a write to that data, so writers block until readers release their lock.

I have two thoughts about concurrent real-time writers. Either the DB server will only accept real-time writes from the first connected client to request real-time writing, and other requests will be dropped as no-ops. Or it will always delete the data for a key (symbol/timestamp) before writing the new data. Plan 2 would require the writer to buffer all ticks for a timestamp before sending; that way multiple writers would just overwrite each other's data until the last one's data persists. Not the most efficient, but workable. Whatever route I go will be encapsulated in the client-side interface I'm exposing to GomFileManager.

The use case for this DB is frequent writes to the end, with rare burst reads of large amounts of data across all keys, and rare burst writes. The only place a real-time writer and a reader conflict is at the last timestamp, so that should keep lock blocking quite minimal. Doing a simultaneous backfill while reading could have higher contention, but should still be quite reasonable, since the number of locks needed to write or read is small.

 
  #98 (permalink)
Elite Member
San Francisco Bay Area
 
Futures Experience: Intermediate
Platform: NT,TOS,IB
Favorite Futures: ES,CL,TF
 
Posts: 278 since Jun 2010
Thanks: 154 given, 267 received

The issue is that timestamps are not guaranteed to be unique. You cannot assume that two trade ticks with exactly the same timestamp are one and the same trade; they may correspond to different trades. Of course, if the data vendor supplies some unique tick identification mechanism, you should be able to avoid the conflict. Otherwise you may end up with duplicate copies of the same tick if you have a live writer and also a backfill application which dumps historical data asynchronously.

This, of course, depends on the timestamp resolution. With 1-second resolution you are virtually guaranteed to have non-unique timestamps; even with ms resolution, timestamps are not guaranteed to be unique for separate trades.

That is why I felt that backfills should overwrite live data and should also be executed with some minimum chunk size.

 
  #99 (permalink)
Elite Member
Austin, TX
 
Futures Experience: Advanced
Platform: TradeStation, TWS
Broker/Data: TradeStation, IB .. for now
Favorite Futures: 6E, ES
 
Posts: 28 since May 2010
Thanks: 0 given, 27 received

BDB accepts duplicate keys and duplicate key/data pairs just fine; you just have to use a construct they call a cursor to access all the data under duplicate keys. The conflict between real-time writers and backfill writers resolves itself when you force the backfill writer to provide all the tick data for each unique timestamp at once. Then you just delete and replace all the data for each timestamp with the backfilled data. It's theoretically possible you would lose some ticks if you backfill up to the current real-time second, but I don't see that being possible in my initial implementation. Downloading backfill for the current partial second and getting it processed and to the server while a real-time writer is still writing that tick is a race the user can avoid by not backfilling up to right now; just go a second back.
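A minimal sketch of that delete-and-replace rule, assuming an in-memory dict keyed by (symbol, timestamp); the function and variable names are invented for illustration and are not the actual server code.

```python
tickdb = {}  # (symbol, ts) -> list of tick tuples, duplicates allowed

def realtime_write(symbol, ts, tick):
    # Live writer: blind append under the (possibly duplicate) key.
    tickdb.setdefault((symbol, ts), []).append(tick)

def backfill_write(symbol, ts, ticks):
    # Backfill writer must send ALL ticks for a timestamp in one batch;
    # assigning the new list drops any existing records for that key
    # and replaces them in a single step.
    tickdb[(symbol, ts)] = list(ticks)

realtime_write("ES", 100, (1220.25, 1))                 # partial live data
backfill_write("ES", 100, [(1220.25, 1), (1220.50, 2)])  # golden batch wins
print(tickdb[("ES", 100)])  # [(1220.25, 1), (1220.50, 2)]
```

This is also why the all-at-once rule matters: if the backfill writer sent the same timestamp in two batches, the second batch would delete the first, losing ticks.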

 
  #100 (permalink)
Elite Member
Toronto
 
Futures Experience: Advanced
Platform: NinjaTrader
 
Posts: 108 since Sep 2009
Thanks: 30 given, 183 received



Originally posted by danjurgens:
It is quite amazing, all the hoops we have to jump through because they won't add 2 columns to a table!

I agree... if you look at how MultiCharts is dealing with their customers (setting up a site to ask for suggestions regarding their new DOM feature), you can tell they are going to eat NinjaTrader's lunch in the not-too-distant future, once their feature set catches up. NT has a suggestion board too; too bad they rarely listen or do anything their customers actually suggest... it's a lot of "my way or the highway" with them.

NT is only getting away with their ridiculously poor customer responsiveness and horribly delayed development timelines because there hasn't been a decently priced competitor for a while... hopefully MultiCharts can change that. I don't blame Big Mike for switching; as soon as they add some discretionary trading features, I may consider it myself.





Copyright © 2017 by futures io, s.a., Av Ricardo J. Alfaro, Century Tower, Panama, +507 833-9432, info@futures.io
All information is for educational use only and is not investment advice.
There is a substantial risk of loss in trading commodity futures, stocks, options and foreign exchange products. Past performance is not indicative of future results.