public static class GravesBidirectionalLSTM.Builder extends BaseRecurrentLayer.Builder<GravesBidirectionalLSTM.Builder>
| Modifier and Type | Field and Description |
|---|---|
| protected boolean | helperAllowFallback - When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user. |
Fields inherited from class BaseRecurrentLayer.Builder:
inputWeightConstraints, recurrentConstraints, rnnDataFormat, weightInitFnRecurrent

Fields inherited from class FeedForwardLayer.Builder:
nIn, nOut

Fields inherited from class BaseLayer.Builder:
activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iupdater, regularization, regularizationBias, weightInitFn, weightNoise

Fields inherited from class Layer.Builder:
allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints

| Constructor and Description |
|---|
| Builder() |
| Modifier and Type | Method and Description |
|---|---|
| GravesBidirectionalLSTM | build() |
| GravesBidirectionalLSTM.Builder | forgetGateBiasInit(double biasInit) - Set forget gate bias initializations. |
| GravesBidirectionalLSTM.Builder | gateActivationFunction(Activation gateActivationFn) - Activation function for the LSTM gates. |
| GravesBidirectionalLSTM.Builder | gateActivationFunction(IActivation gateActivationFn) - Activation function for the LSTM gates. |
| GravesBidirectionalLSTM.Builder | gateActivationFunction(String gateActivationFn) - Activation function for the LSTM gates. |
| GravesBidirectionalLSTM.Builder | helperAllowFallback(boolean allowFallback) - When using a helper (CuDNN or MKLDNN in some cases) and an error is encountered, should fallback to the non-helper implementation be allowed? If set to false, an exception in the helper will be propagated back to the user. |
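The methods summarized above are chained on a single builder instance. A minimal sketch of typical usage (the import paths, the `Activation.SIGMOID` constant, and the illustrative layer sizes are assumptions based on the standard Deeplearning4j/ND4J API, not stated on this page):

```java
// Hypothetical usage sketch; assumes the standard DL4J and ND4J packages.
import org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM;
import org.nd4j.linalg.activations.Activation;

public class GateConfigExample {
    public static void main(String[] args) {
        GravesBidirectionalLSTM layer = new GravesBidirectionalLSTM.Builder()
                .nIn(100)                                   // input size per time step (inherited setter)
                .nOut(200)                                  // LSTM units per direction (inherited setter)
                .forgetGateBiasInit(1.0)                    // bias the forget gate toward "remember" early in training
                .gateActivationFunction(Activation.SIGMOID) // gates squashed to (0, 1)
                .helperAllowFallback(true)                  // fall back to built-in impl if CuDNN/MKLDNN fails
                .build();
    }
}
```

The builder pattern means the order of the setter calls does not matter; only the final `build()` produces the layer configuration.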
Methods inherited from class BaseRecurrentLayer.Builder:
constrainInputWeights, constrainRecurrent, dataFormat, weightInitRecurrent, weightInitRecurrent, weightInitRecurrent

Methods inherited from class FeedForwardLayer.Builder:
nIn, nIn, nOut, nOut, units

Methods inherited from class BaseLayer.Builder:
activation, activation, biasInit, biasUpdater, dist, gainInit, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, regularization, regularizationBias, updater, updater, weightDecay, weightDecay, weightDecayBias, weightDecayBias, weightInit, weightInit, weightInit, weightNoise

Methods inherited from class Layer.Builder:
constrainAllParameters, constrainBias, constrainWeights, dropOut, dropOut, name

protected boolean helperAllowFallback
public GravesBidirectionalLSTM.Builder forgetGateBiasInit(double biasInit)

public GravesBidirectionalLSTM.Builder gateActivationFunction(String gateActivationFn)
Parameters:
gateActivationFn - Activation function for the LSTM gates

public GravesBidirectionalLSTM.Builder gateActivationFunction(Activation gateActivationFn)
Parameters:
gateActivationFn - Activation function for the LSTM gates

public GravesBidirectionalLSTM.Builder gateActivationFunction(IActivation gateActivationFn)
Parameters:
gateActivationFn - Activation function for the LSTM gates

public GravesBidirectionalLSTM.Builder helperAllowFallback(boolean allowFallback)
Parameters:
allowFallback - Whether fallback to the non-helper implementation should be used

public GravesBidirectionalLSTM build()
build in class Layer.Builder<GravesBidirectionalLSTM.Builder>

Copyright © 2021. All rights reserved.
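The configuration returned by build() is typically not used on its own but added to a network configuration. A hedged sketch of that step, assuming the usual NeuralNetConfiguration and MultiLayerConfiguration classes from the same library (they are not documented on this page, and the sizes are illustrative):

```java
// Hypothetical usage sketch; assumes standard DL4J configuration classes.
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM;

public class NetworkExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new GravesBidirectionalLSTM.Builder()
                        .nIn(100)
                        .nOut(200)
                        .build())
                // further layers (e.g. an output layer) would follow here
                .build();
    }
}
```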