For three weeks I have been working with neural networks and genetic programming. I had heard about these topics for years, but nothing clicked for me. Now the circle is closed.
As a seasoned Googler, I found AForge.NET (Computer Vision, Artificial Intelligence, Robotics Framework), which has beautiful example applications. That is important for the learning process. See links below.
Last week I ordered a book from Amazon: Introduction to Neural Networks for C#, 2nd Edition (Perfect Paperback).
You can use its classes in your NinjaScript or build external programs that you connect to. Another important reason: AForge is open source, and that is good. Free world, free sources and free money ...
Have you compared this with FANN? I have not used either, and I have often wanted to work with neural networks but am not sure where to start. I use NinjaTrader and am unsure how to send prices/indicator values to a NN and get a signal back (I am a beginner programmer). Any thoughts on how to start, or which program is better to use (assuming both are good)?
I use NeuronDotNet, which is similar to AForge. I prefer its structure, but they both have pretty much the same features.
Encog is another good project, with both a Java and a C# version. Unfortunately the C# version is way behind at the moment, but the Java version is worth looking at for ideas even if you don't use Java. It has many useful network types and training algorithms that are not found in any of the C# open-source projects I know of.
Neural nets are a vast subject. Their inherent flexibility can be both a strength and a weakness, so be prepared to spend a lot of time to get anything that is truly useful.
All of the successful systems I have seen employ multiple technologies, usually with multiple neural networks working as an ensemble. By multiple technologies I mean they use other decision-support mechanisms such as fuzzy logic, Bayesian inference, mixture models, etc. In other words, these are not simple feedforward perceptron networks trained by gradient descent (I wish it were that easy) =)
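To make the ensemble idea concrete, here is a minimal, hypothetical sketch (Python for illustration; the function name, votes, and threshold are my own assumptions, not from any library mentioned in this thread): several independent "experts" (stand-ins for networks, fuzzy-logic rules, or Bayesian models) each emit a directional vote, and a weighted majority must clear a threshold before the ensemble acts.

```python
def ensemble_signal(votes, weights, threshold=0.5):
    """Combine expert votes (-1 = short, 0 = flat, +1 = long) by weighted majority.

    Hypothetical sketch: the combined score must exceed `threshold` of the
    total weight before the ensemble commits to a direction.
    """
    score = sum(v * w for v, w in zip(votes, weights))
    total = sum(weights)
    if score > threshold * total:
        return 1       # consensus long
    if score < -threshold * total:
        return -1      # consensus short
    return 0           # experts disagree: stand aside
```

The point of the threshold is exactly what the post describes: no single model is trusted on its own, and when the components disagree the system simply does nothing.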
I have spent several months porting a GMDH polynomial neural network from VBA to NinjaTrader (NT). I have it up and running in NT in real time; I would describe the code as almost beta.
To find out more about GMDH, you can Google GMDH or visit the link here.
I am converting the Excel implementation that can be found here.
What I like about this type of neural network is that it is very fast at arriving at a network. Example: a time series with 48 cases and 24 variables can produce a network in 00:00:00.00259. That is fast enough that one could create a new network on every 1- or 2-minute bar. Example: 200 cases with 5 variables, forecasting 1 bar in advance, can produce a network in 00:00:00.00134. Again, very doable in real time on a 1- or 2-minute chart. All of this runs entirely inside NT. I am used to spending minutes if not hours in a stand-alone program just to get a network, and then having to find a way to use it in real time. This seems to solve that problem.
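For readers unfamiliar with why GMDH is so fast: each candidate neuron is just a small polynomial fitted by least squares over a pair of inputs, and the network keeps whichever candidates generalize best on held-out cases. Below is a minimal one-layer sketch (Python for illustration; the post's actual port is VBA/NinjaScript, and all names here are my own assumptions):

```python
import itertools

def fit_quadratic(x1, x2, y):
    """Least-squares fit of y ~ a + b*x1 + c*x2 + d*x1*x2 + e*x1^2 + f*x2^2."""
    X = [[1.0, a, b, a * b, a * a, b * b] for a, b in zip(x1, x2)]
    n = 6
    # Normal equations (X^T X) w = X^T y, solved by Gaussian elimination.
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    v = [sum(r[i] * t for r, t in zip(X, y)) for i in range(n)]
    for col in range(n):                        # forward elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    w = [0.0] * n
    for i in reversed(range(n)):                # back substitution
        w[i] = (v[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]
    return w

def predict(w, a, b):
    return w[0] + w[1] * a + w[2] * b + w[3] * a * b + w[4] * a * a + w[5] * b * b

def gmdh_layer(inputs, y, split):
    """Try every pair of input columns; return (val_error, i, j, weights) for the
    quadratic neuron that does best on the held-out cases past index `split`."""
    best = None
    for i, j in itertools.combinations(range(len(inputs)), 2):
        x1, x2 = inputs[i], inputs[j]
        w = fit_quadratic(x1[:split], x2[:split], y[:split])
        err = sum((predict(w, a, b) - t) ** 2
                  for a, b, t in zip(x1[split:], x2[split:], y[split:]))
        if best is None or err < best[0]:
            best = (err, i, j, w)
    return best
```

A full GMDH run would stack such layers, feeding the surviving neurons' outputs in as the next layer's inputs until the validation error stops improving; but even this one-layer version shows why the timings quoted above are plausible, since each fit is a tiny 6x6 linear solve.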
Now, can this actually be used to produce something that will make money? That is the $64,000 question. The easy part is probably now done, and the really hard work is about to begin.
The problem with using GMDH in the way you describe (i.e., constantly regenerating the network structure) is that you have no control over what it is doing and no knowledge of exactly how it is arriving at its solution.
It may work fantastically sometimes, and at other times it may fail miserably; you are basically at the mercy of the network. At least that is my experience with completely autonomous self-organizing networks.
There is definitely power in a dynamic network structure, but I think that to harness it effectively you will need to use hints, templates, or one of the other methods for defining a network starting point and constraining how and what can change.
Another technology you might take a look at is NEAT/HyperNEAT (NeuroEvolution of Augmenting Topologies).