NexusFi: Find Your Edge
SGD

SGD (Stochastic Gradient Descent) is a popular algorithm for training a wide range of machine learning models, including (linear) support vector machines and logistic regression. It is an iterative optimization process.

During training, one typically monitors the loss and looks for convergence toward a minimum.

For efficiency, the gradient can be computed over a mini-batch of samples rather than for every single training sample.
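The mini-batch idea above can be sketched as follows. This is an illustrative NumPy implementation for a simple linear regression with mean-squared-error loss (the data, learning rate, and batch size are arbitrary choices for the example, not anything from the original page):

```python
import numpy as np

# Toy linear regression data: y = X @ true_w + small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)

w = np.zeros(3)          # parameters to learn
lr, batch_size = 0.1, 32 # learning rate and mini-batch size

for epoch in range(50):
    idx = rng.permutation(len(X))          # shuffle each epoch
    for start in range(0, len(X), batch_size):
        b = idx[start:start + batch_size]
        # Gradient of the MSE loss, computed on the mini-batch only
        grad = 2.0 * X[b].T @ (X[b] @ w - y[b]) / len(b)
        w -= lr * grad                     # SGD update

print(np.round(w, 2))  # close to true_w
```

Each update uses only `batch_size` samples, so one epoch costs the same total work as full-batch gradient descent but performs many more (noisier) parameter updates.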

In Keras, the SGD optimizer takes the following parameters:
  • lr : learning rate
  • momentum : momentum
  • decay : decay of the learning rate over each update
  • nesterov : true/false, whether to apply Nesterov momentum
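The update rule these parameters control can be sketched in plain NumPy-style Python. This follows the classic Keras semantics (time-based learning-rate decay, velocity-based momentum, optional Nesterov look-ahead); it is a sketch for intuition, and the library source remains the authoritative reference. The function name `sgd_step` and the toy objective are inventions for this example:

```python
def sgd_step(p, grad, v, iteration, lr=0.01, momentum=0.0,
             decay=0.0, nesterov=False):
    """One SGD update sketching the classic Keras parameter semantics."""
    # decay: shrink the learning rate as the update count grows
    lr_t = lr / (1.0 + decay * iteration)
    # momentum: velocity accumulates a weighted history of past gradients
    v = momentum * v - lr_t * grad
    if nesterov:
        # Nesterov: "look ahead" along the velocity before stepping
        p = p + momentum * v - lr_t * grad
    else:
        p = p + v
    return p, v

# Usage: minimize f(p) = p**2 (gradient 2p) starting from p = 5.0
p, v = 5.0, 0.0
for t in range(100):
    p, v = sgd_step(p, 2.0 * p, v, t, lr=0.1, momentum=0.9, nesterov=True)
# p has converged near the minimum at 0
```

With `momentum=0` and `decay=0` this reduces to plain SGD, `p -= lr * grad`.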



© 2024 NexusFi™, s.a., All Rights Reserved.