Austin, TX
Experience: Advanced
Platform: TradeStation
Trading: Futures
Posts: 839 since Mar 2011
Thanks Given: 124
Thanks Received: 704
One of the webinars on futures.io (formerly BMT) featured a gentleman who made a statement I found pretty profound, and I've searched but cannot find any public information on the subject.
He basically stated that the accuracy of an indicator decreases exponentially as you reduce the timeframe.
On the surface this seems to make sense: the closer you zoom toward the tick level, the more the market's movement from any given point resembles a random walk.
I've done my own studies on randomization, and I can show statistically that as you approach the tick-by-tick level, the likelihood that the market will move x ticks in either direction becomes essentially random. At any given point in a liquid market, the chance that price moves 10 ticks down before it moves 10 ticks up is essentially 50% when examining large sample sizes.
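The 50/50 claim is easy to sanity-check with a quick simulation. This is just a sketch using an idealized symmetric coin-flip walk (not real tick data), asking how often price falls 10 ticks below the start before it rises 10 ticks above it:

```python
import random

def hits_down_first(ticks: int, rng: random.Random) -> bool:
    """Walk one tick up or down with equal probability; return True if
    the walk reaches -ticks before it reaches +ticks."""
    pos = 0
    while abs(pos) < ticks:
        pos += 1 if rng.random() < 0.5 else -1
    return pos <= -ticks

rng = random.Random(42)
trials = 20_000
down_first = sum(hits_down_first(10, rng) for _ in range(trials))
print(down_first / trials)  # should land near 0.50 for a symmetric walk
```

By symmetry the true probability is exactly 50%; a run of 20,000 trials should come out within a percent or so of that. Real tick data would replace the coin flip with sampled tick-to-tick moves.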
As you move further out in timeframe, the markets begin to reveal more structure. The same is true as you increase the movement variable (x in my example).
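The point about the movement variable can also be illustrated in simulation. Assuming a hypothetical tiny per-tick bias (50.2% up instead of 50% — a stand-in for whatever structure exists, not an observed market figure), the probability of hitting +x before -x drifts further from 50% as x grows, because a larger barrier gives the small edge more flips to compound:

```python
import random

def up_first_prob(x: int, p_up: float, trials: int, seed: int = 1) -> float:
    """Estimate the probability that a walk with per-tick up-probability
    p_up reaches +x ticks before -x ticks."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        pos = 0
        while abs(pos) < x:
            pos += 1 if rng.random() < p_up else -1
        wins += pos >= x
    return wins / trials

# hypothetical 0.2% per-tick edge: the asymmetry grows with the barrier x
for x in (5, 10, 30):
    print(x, up_first_prob(x, 0.502, 5_000))
```

This matches the gambler's-ruin result: the same small bias that is nearly invisible over a 5-tick move becomes clearly measurable over a 30-tick move, which is one way to read "the market reveals more structure further out."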
I have not, however, done any research on whether typical (or atypical) indicators are more or less accurate depending on the timeframe.
If anyone has leads on where I might find similar information or studies, or ideas on how we might confirm or refute this notion, it would be helpful.
Thanks
"A dumb man never learns. A smart man learns from his own failure and success. But a wise man learns from the failure and success of others." |