AdaGrad Class
AdaGrad is an algorithm for gradient-based optimization that adapts the learning rate to each parameter, performing larger updates for infrequently updated parameters and smaller updates for frequently updated ones.
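The per-parameter adaptation works by accumulating the squares of past gradients and dividing each step by the square root of that accumulator. The following is a minimal Python sketch of the AdaGrad update rule itself (not the SiaNet API; the function and parameter names here are illustrative):

```python
import math

def adagrad_update(params, grads, accum, lr=0.01, eps=1e-8):
    """One AdaGrad step: accumulate squared gradients, then scale
    each parameter's step by 1 / (sqrt(accumulated) + eps).

    Parameters with a large history of gradients get a small effective
    learning rate; rarely-updated parameters keep a larger one.
    """
    new_accum = [a + g * g for a, g in zip(accum, grads)]
    new_params = [p - lr * g / (math.sqrt(a) + eps)
                  for p, g, a in zip(params, grads, new_accum)]
    return new_params, new_accum
```

Note that on the very first step the effective update magnitude is close to `lr` for every parameter regardless of gradient scale, since the step is `lr * g / sqrt(g * g)`; the adaptation kicks in as squared gradients accumulate across steps.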
Namespace: SiaNet.Model.Optimizers
Assembly: SiaNet (in SiaNet.dll)
Syntax
public class AdaGrad : BaseOptimizer
Public Class AdaGrad _
    Inherits BaseOptimizer
public ref class AdaGrad : public BaseOptimizer
Inheritance Hierarchy
SiaNet.Model.Optimizers.BaseOptimizer
  SiaNet.Model.Optimizers.AdaGrad