Adam Class
Adaptive Moment Estimation (Adam) is another method that computes adaptive learning rates for each parameter. In addition to storing an exponentially decaying average of past squared gradients v_t, like Adadelta and RMSprop, Adam also keeps an exponentially decaying average of past gradients m_t, similar to momentum.
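The update rule described above can be sketched in plain Python. This is not SiaNet's implementation, just a minimal scalar illustration of the standard Adam step (with the usual bias correction); the default hyperparameters shown (lr=0.001, beta1=0.9, beta2=0.999) are the values commonly used in the literature, not confirmed SiaNet defaults.

```python
import math

def adam_step(param, grad, m, v, t,
              lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter.

    m: exponentially decaying average of past gradients (momentum-like)
    v: exponentially decaying average of past squared gradients (RMSprop-like)
    t: 1-based step counter, needed for bias correction
    """
    m = beta1 * m + (1 - beta1) * grad        # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)              # bias-corrected second moment
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# Usage: repeatedly applying the step with a positive gradient
# moves the parameter downward.
p, m, v = 1.0, 0.0, 0.0
for t in range(1, 11):
    p, m, v = adam_step(p, 1.0, m, v, t)
```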
Namespace: SiaNet.Model.Optimizers
Assembly: SiaNet (in SiaNet.dll)
Syntax
C#
public class Adam : BaseOptimizer

VB
Public Class Adam
    Inherits BaseOptimizer

C++
public ref class Adam : public BaseOptimizer
Inheritance Hierarchy
SiaNet.Model.Optimizers.BaseOptimizer
  SiaNet.Model.Optimizers.Adam