SGD Class
SGD (Stochastic Gradient Descent) is an optimisation technique. It is an alternative to standard (full-batch) gradient descent and to quasi-Newton methods such as BFGS. It still converges quickly, with some advantages:
- Doesn't require storing all training data in memory (good for large training sets)
- Allows adding new data in an "online" setting
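The update rule behind this class can be sketched in framework-neutral code. This is an illustrative example of the SGD algorithm itself, not SiaNet's API; the function and variable names below are hypothetical.

```python
import random

def sgd_step(w, grad, lr=0.01):
    """One SGD update: move each weight against its gradient."""
    return [wi - lr * gi for wi, gi in zip(w, grad)]

# Fit y = 2x with a single weight using per-sample ("online") updates.
# Each step uses the gradient of one sample's squared error, not the
# whole dataset, which is what makes the method stochastic.
random.seed(0)
data = [(x, 2.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
w = [0.0]
for epoch in range(200):
    random.shuffle(data)                # visit samples in random order
    for x, y in data:
        pred = w[0] * x
        grad = [2.0 * (pred - y) * x]   # d/dw of (pred - y)^2
        w = sgd_step(w, grad, lr=0.1)
print(round(w[0], 3))                   # w converges to 2.0
```

Because each update touches only one sample, the loop needs O(1) memory for data and can keep training as new samples arrive, which is the "online" advantage noted above.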
Namespace: SiaNet.Model.Optimizers
Assembly: SiaNet (in SiaNet.dll)
Syntax
C#
public class SGD : BaseOptimizer
Visual Basic
Public Class SGD _
	Inherits BaseOptimizer
Visual C++
public ref class SGD : public BaseOptimizer
Inheritance Hierarchy
SiaNet.Model.Optimizers.BaseOptimizer
  SiaNet.Model.Optimizers.SGD