Recurrent.GRU Method
Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks, introduced in 2014. Their performance on polyphonic music modeling and speech signal modeling was found to be similar to that of long short-term memory (LSTM).[1] GRUs have fewer parameters than LSTMs, as they lack an output gate.
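For reference, one common formulation of the GRU cell (following Cho et al., 2014; this page does not state the exact parameterization SiaNet uses) is:

z_t = \sigma(W_z x_t + U_z h_{t-1} + b_z)                        (update gate)
r_t = \sigma(W_r x_t + U_r h_{t-1} + b_r)                        (reset gate)
\tilde{h}_t = \tanh(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h)     (candidate state)
h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t            (new hidden state)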
Namespace: SiaNet.NN
Assembly: SiaNet (in SiaNet.dll)
Syntax
C#
public static Function GRU(
    int inputDim,
    uint hiddenSize,
    uint numLayers,
    bool bidirectional,
    string weightInitializer
)
Visual Basic
Public Shared Function GRU ( _
    inputDim As Integer, _
    hiddenSize As UInteger, _
    numLayers As UInteger, _
    bidirectional As Boolean, _
    weightInitializer As String _
) As Function
Visual C++
public:
static Function^ GRU(
    int inputDim,
    unsigned int hiddenSize,
    unsigned int numLayers,
    bool bidirectional,
    String^ weightInitializer
)
Parameters
- inputDim
- Type: Int32
The input dimension.
- hiddenSize
- Type: UInt32
Size of the hidden layer.
- numLayers
- Type: UInt32
The number of stacked recurrent layers.
- bidirectional
- Type: Boolean
If true, the RNN is bidirectional.
- weightInitializer
- Type: String
The weight initializer.
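A minimal usage sketch follows. The call matches the signature documented above; the initializer string "glorot_uniform" and the argument values are illustrative assumptions, not values confirmed by this page, so consult the SiaNet samples for the exact strings your version accepts.

C#
using SiaNet.NN;

class GruExample
{
    static void Main()
    {
        // Build a 2-layer, unidirectional GRU over 100-dimensional inputs
        // with a 128-unit hidden state. The initializer name
        // "glorot_uniform" is an assumption; substitute an initializer
        // string supported by your SiaNet version.
        var gru = Recurrent.GRU(
            inputDim: 100,
            hiddenSize: 128,
            numLayers: 2,
            bidirectional: false,
            weightInitializer: "glorot_uniform");

        // The returned Function can then be composed with other layers
        // when assembling a network.
    }
}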