Record a differentiated feed every [min] with Sierra Chart?
I have read the threads about recording data feed streams (from eSignal, IQFeed, ...) and the impressive work done through the "GomiRecorder".
I have some questions about GomiRecorder and Sierra Charts:
- Broadly: is the GomiRecorder code, written in C#, a good starting point for imagining a C/C++ adaptation/port toward a basic data engine that could export data every [min] from a chart/study?
- More precisely: since the GomiRecorder framework takes a polymorphic approach, it can theoretically produce several types of file export; AFAIK for NinjaTrader at least: text (probably for [daily]) and binary (probably for intraday), specialized around NinjaTrader's event-distribution loop. So, is restarting from the GomiRecorder skeleton a good idea for exporting a live data feed displayed in an SC chart/study (using the ACSIL API and the "emulated" events - OnBarUpdate, OnMarketData, OnBarUpdateDone - of the GomiRecorder classes)?
I know this is a very technical, open-ended computer question, but I'm just a novice trying to start his learning curve with SC and ACSIL.
Any light will be appreciated.
Regards.
There are already native SC studies to export data from charts, both historical and live-updating. In addition, the data that SC stores is easily read from outside of SC.
And then there are also native ways to feed data into SC through the DTC protocol server.
Thank you for the guidance. I've tested "Write Bar Data to File" and the "Spreadsheet Study": they work perfectly.
It does exactly what I want... but it apparently only exports data into one and the same file, which keeps growing. I couldn't find a parameter to create _differentiated_ files (e.g. export a [1mn] timeframe and change the output file every [5mn]). The idea behind this is to re-import the exported data into a database, and accessing this single file in shared mode is a little trickier than accessing distinct, differentiated files created every x [KB].
After skimming the DTC protocol documentation, I find it overkill for me. It mainly seems to be a way to run several instances of SC on the same computer. For example, on a physical server with 4 multi-core CPUs, it seems recommended to dedicate one SC instance on one CPU as the DTC server for the SC instances on the other CPUs, i.e. relaying the downloaded data to them. That said, since the protocol is fully documented, developing a client that exports the messages carrying market data is indeed possible.
For now, at my level (i.e. with a huge lack of knowledge of SC's possibilities), I keep coming back to the scenario of diving into ACSIL (the API documentation) and understanding where I can hook into the data-distribution loop and its storage in arrays, sub-arrays, etc., to work out how to export each x [KB] chunk into a new file.
==> Unless there are known links to source code for studies using this DTC protocol...?
I know. In fact, rather than a continuous export into the same file, I would have liked the export to create a new file every minute. Looking at the properties of the "Write Bar Data to File" study, it would have been perfect for me if there had been a property named "Split exported file at x minute(s)", for example.
Why? Because creating a file per minute, rather than adding a line per minute to the same file, is, IMO, much simpler for an external observer program to detect on the hard disk.
It's a pretty simple ACSIL script to do that. I think I have already written something like that which saves the last X bars; it's easy to give the file a timestamped name so it's a new file each time. I'll have a look tomorrow.
BUT why?
That's the worst way to transfer data, especially when there is the DTC server to stream data over the wire. Also, the data is already readily accessible in binary format in the .scid files. And lastly, you can write your own script in C++ to send anything you want straight to another program, again either over the wire or as an array through a linked DLL.
So back to my question: what's the ultimate aim? What are you doing with the data that you need it outside of SC in a text file? The quick solution in these cases just builds fragile and non-performant systems.
Well, reading this, I realize that my general knowledge of SC clearly sucks. That much is obvious! Not having read the SC technical documentation, my question was bound to be too fuzzy.
So I'm going to do my homework (read the documentation; I really like SC's philosophy) before continuing with another question.
To rephrase my original question differently: I have SC producing a resource (a cumulative file).
On the other hand, I'm trying to see how to consume this resource with an external worker thread, in order to store it in a database.
I think it would be easier to obtain the resource, from the outside, as small files produced every minute - which I could access in exclusive mode - rather than as a single file shared with SC.
Anyway, sincerely, thank you.
And don't bother any more with this too-nebulous question.