First, you don't need a lot of memory just to load a large data set, because the Windows operating system will swap data to disk if it can't all fit into RAM, recalling it from disk only when needed.
If you are talking about loading the data alone (i.e., without indicators) and you want everything resident in memory, then you would need at least twice that amount. Memory use grows quickly beyond that; it depends on how many variables you have and on how many arrays you use and how big they are, because each series variable requires a memory location for every bar of the data you have loaded. (Note: "loaded" means loaded by MultiCharts, not what is in your database.)
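As a rough illustration of how memory scales with loaded data, here is a back-of-envelope sketch in Python. The per-bar field count and 8-byte field size are assumptions for illustration, not MultiCharts internals; real platforms add per-bar overhead on top of this.

```python
# Rough estimate of RAM needed to hold a loaded price series in memory.
# Assumes each bar stores 5 double-precision fields (OHLC + volume) at
# 8 bytes each -- an illustrative assumption, not a platform spec.

BYTES_PER_FIELD = 8   # 64-bit double
FIELDS_PER_BAR = 5    # open, high, low, close, volume (assumed)

def series_bytes(num_bars, num_series_vars=0):
    """Bytes for the raw bars plus any per-bar series variables.

    Each series variable keeps one value per loaded bar, so every
    extra series variable adds another full-length array.
    """
    bars = num_bars * FIELDS_PER_BAR * BYTES_PER_FIELD
    extras = num_bars * num_series_vars * BYTES_PER_FIELD
    return bars + extras

# Example: ~3.6 million 1-minute bars (about 10 years of a 24-hour
# futures session) with 10 extra series variables.
total = series_bytes(3_628_800, num_series_vars=10)
print(f"~{total / 1e6:.0f} MB")
```

The point of the sketch is the second term: every additional series variable costs another full-length array, which is why memory demand depends on your code, not just on the raw data size.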
That's not the way AmiBroker works, but it sounds like MC is different: in AB, if you want to import a data set, it loads the whole thing into RAM. Besides, if you actually want to do anything with the data (backtest, etc.), wouldn't you need the entire set resident in RAM just to get decent performance? That is, if it had to swap to disk, even with an SSD, wouldn't that slow things to a crawl? [See, for instance, the discussion of disk vs. RAM vs. cache access by the developer of AmiBroker, at the end of this link: http://www.amibroker.com/guide/h_multithreading.html]
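To make the RAM-vs-disk concern concrete, here is a hedged back-of-envelope comparison of streaming a data set once from RAM versus an SSD. The bandwidth figures are rough order-of-magnitude assumptions, not measurements of any particular machine, and real access patterns (random reads, page faults) can be far worse than sequential streaming.

```python
# Back-of-envelope: time to scan a data set once from RAM vs. SSD.
# Bandwidth numbers below are illustrative assumptions only.

RAM_GBPS = 20.0   # assumed sequential DRAM bandwidth, GB/s
SSD_GBPS = 2.0    # assumed NVMe SSD sequential read, GB/s

def scan_seconds(dataset_gb, gbps):
    """Seconds to stream the whole data set once at the given bandwidth."""
    return dataset_gb / gbps

for size_gb in (1, 8, 32):
    ram = scan_seconds(size_gb, RAM_GBPS)
    ssd = scan_seconds(size_gb, SSD_GBPS)
    print(f"{size_gb:>2} GB: RAM ~{ram:.2f}s  SSD ~{ssd:.1f}s")
```

Under these assumptions a single pass from SSD is roughly 10x slower than from RAM, which is the basis of the "keep it resident" argument; whether that matters in practice depends on how many passes the backtest makes over the data.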
OK, sounds like MC's memory demands are similar to AB's.
If you are backtesting, you DO NOT need to load ALL the data into memory.
NO, it does not slow things down. Swapping is fast, especially in a backtesting environment, where the data is static and there are no real-time updates.
I don't know how AmiBroker's memory management works. It is supposed to be very good at backtesting. When it comes to optimization it is not as fast as MultiCharts, but its backtesting performance is reportedly good.