Overfitting - futures io


Overfitting is a common problem in machine learning and statistics.

It means that the model has been fit so tightly to the training sample data that it fails miserably when new, previously unseen data is presented to the trained model.

When overfitting has occurred, the noise in the data has also been included in the model. A useful way of thinking about overfitting is that the model has started to memorize the data rather than learn a model that is able to generalize from the data (a model that is also valid for new, unseen data).
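The memorization idea can be made concrete with a toy sketch (a hypothetical example, not from the original text): a "model" that simply stores every training pair in a lookup table. It recalls the training data perfectly, but has learned no general rule it can apply to unseen inputs.

```python
import random

def true_label(x):
    """The underlying rule the model should learn: the sign of x."""
    return 1 if x >= 0 else 0

random.seed(0)
train_x = [random.uniform(-1, 1) for _ in range(50)]
train_y = [true_label(x) for x in train_x]

# "Training" = pure memorization: store exact (input -> label) pairs.
memorized = dict(zip(train_x, train_y))

def predict(x):
    # Perfect recall on inputs it has seen; an arbitrary guess otherwise,
    # because the lookup table encodes no general rule.
    return memorized.get(x, 0)

train_acc = sum(predict(x) == y for x, y in zip(train_x, train_y)) / len(train_x)

test_x = [random.uniform(-1, 1) for _ in range(50)]
test_acc = sum(predict(x) == true_label(x) for x in test_x) / len(test_x)

# Training accuracy is perfect (1.0); accuracy on unseen data is far lower.
print(train_acc, test_acc)
```

A real overfit model is rarely this extreme, but the pattern is the same: a large gap between performance on the learning sample and performance on held-out data.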

There are various steps one can take to avoid overfitting:
  • cross-validation
  • regularization
  • early stopping
  • pruning
  • Bayesian priors on parameters
  • model comparison
  • dropout
  • restrict the complexity of the model
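The first item in the list, cross-validation, can be sketched in a few lines: split the data into k folds, hold out each fold in turn for validation, and train on the rest, so every sample is used for validation exactly once. The function name below is my own; libraries such as scikit-learn provide production implementations.

```python
def k_fold_splits(n_samples, k):
    """Yield (train_indices, validation_indices) for each of k folds."""
    indices = list(range(n_samples))
    # Distribute samples as evenly as possible across the k folds.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, val
        start += size

# Usage: evaluate the model on each held-out fold and average the scores.
splits = list(k_fold_splits(10, 3))
print(len(splits))    # 3 folds
print(splits[0][1])   # validation indices of the first fold: [0, 1, 2, 3]
```

Averaging the validation score over all folds gives a more honest estimate of how the model will perform on unseen data than the training score alone.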

Copyright © 2021 by futures io, s.a., Av Ricardo J. Alfaro, Century Tower, Panama, Ph: +507 833-9432 (Panama and Intl), +1 888-312-3001 (USA and Canada), info@futures.io
All information is for educational use only and is not investment advice.
There is a substantial risk of loss in trading commodity futures, stocks, options and foreign exchange products. Past performance is not indicative of future results.