public abstract class NoParamLayer extends Layer
Nested classes inherited from class Layer: `Layer.Builder<T extends Layer.Builder<T>>`

Fields inherited from class Layer: `constraints`, `iDropout`, `layerName`

| Modifier | Constructor and Description |
|---|---|
| `protected` | `NoParamLayer(Layer.Builder builder)` |
| Modifier and Type | Method and Description |
|---|---|
| `GradientNormalization` | `getGradientNormalization()` |
| `double` | `getGradientNormalizationThreshold()` |
| `List<Regularization>` | `getRegularizationByParam(String paramName)` Get the regularization types (l1/l2/weight decay) for the given parameter. |
| `ParamInitializer` | `initializer()` |
| `boolean` | `isPretrainParam(String paramName)` Is the specified parameter a layerwise pretraining-only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers with no pretrainable parameters (such as DenseLayer) return false for all (valid) inputs. |
| `void` | `setNIn(InputType inputType, boolean override)` Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type. |
Methods inherited from class Layer: `clone`, `getMemoryReport`, `getOutputType`, `getPreProcessorForInputType`, `getUpdaterByParam`, `initializeConstraints`, `instantiate`, `resetLayerDefaultConfig`, `setDataType`

Methods inherited from class java.lang.Object: `equals`, `finalize`, `getClass`, `hashCode`, `notify`, `notifyAll`, `toString`, `wait`

Methods inherited from interface TrainingConfig: `getLayerName`

`protected NoParamLayer(Layer.Builder builder)`
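The protected constructor indicates the usual DL4J subclassing pattern: a concrete no-parameter layer threads its own `Builder` up through `NoParamLayer` to `Layer`. A minimal self-contained sketch of that pattern, using hypothetical stand-in classes (`LayerSketch`, `NoParamLayerSketch`, `UpsamplingSketch`) rather than the real DL4J types:

```java
// Hypothetical stand-ins sketching how a concrete no-parameter layer
// passes its Builder up to a protected NoParamLayer-style constructor.
abstract class LayerSketch {
    final String layerName;

    protected LayerSketch(Builder<?> builder) {
        this.layerName = builder.layerName;
    }

    // Self-referential generic so fluent setters return the subclass type.
    public static abstract class Builder<T extends Builder<T>> {
        String layerName;

        @SuppressWarnings("unchecked")
        public T name(String n) {
            this.layerName = n;
            return (T) this;
        }
    }
}

abstract class NoParamLayerSketch extends LayerSketch {
    // Mirrors the protected NoParamLayer(Layer.Builder builder) constructor.
    protected NoParamLayerSketch(Builder<?> builder) {
        super(builder);
    }
}

class UpsamplingSketch extends NoParamLayerSketch {
    UpsamplingSketch(Builder builder) {
        super(builder);
    }

    static class Builder extends LayerSketch.Builder<Builder> {
        UpsamplingSketch build() {
            return new UpsamplingSketch(this);
        }
    }
}

public class BuilderDemo {
    public static void main(String[] args) {
        UpsamplingSketch layer = new UpsamplingSketch.Builder().name("up1").build();
        System.out.println(layer.layerName); // prints "up1"
    }
}
```

The self-referential `Builder<T extends Builder<T>>` generic (shown in the inherited nested class above) is what lets fluent setters on the base builder return the concrete subclass builder without casts at the call site.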
`public ParamInitializer initializer()`

Overrides: `initializer` in class `Layer`

`public void setNIn(InputType inputType, boolean override)`

Overrides: `setNIn` in class `Layer`

`public List<Regularization> getRegularizationByParam(String paramName)`

Specified by: `getRegularizationByParam` in interface `TrainingConfig`
Overrides: `getRegularizationByParam` in class `Layer`
Parameters: `paramName` - Parameter name ("W", "b" etc.)

`public GradientNormalization getGradientNormalization()`

`public double getGradientNormalizationThreshold()`

`public boolean isPretrainParam(String paramName)`

Specified by: `isPretrainParam` in interface `TrainingConfig`
Overrides: `isPretrainParam` in class `Layer`
Parameters: `paramName` - Parameter name/key
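The per-parameter methods above all follow from the same fact: a `NoParamLayer` has no trainable parameters, so there is nothing to regularize and nothing that could be pretrain-only. A self-contained sketch of that contract, using hypothetical stand-in classes (not the real DL4J types; returning `null` for "no regularization" is an assumption mirroring the convention implied above):

```java
import java.util.List;

// Hypothetical stand-in for the Regularization type.
interface RegularizationSketch {}

// Sketch of the NoParamLayer contract: no trainable parameters means
// no per-parameter regularization and no pretrain-only parameters.
abstract class ParamlessLayerSketch {
    // No parameters exist, so no parameter has regularization applied.
    // (Assumption: null signals "no regularization for this parameter".)
    public List<RegularizationSketch> getRegularizationByParam(String paramName) {
        return null;
    }

    // No parameters exist, so none can be a layerwise-pretraining-only one.
    public boolean isPretrainParam(String paramName) {
        return false;
    }
}

class ActivationLayerSketch extends ParamlessLayerSketch {}

public class ContractDemo {
    public static void main(String[] args) {
        ParamlessLayerSketch layer = new ActivationLayerSketch();
        System.out.println(layer.isPretrainParam("W"));          // prints "false"
        System.out.println(layer.getRegularizationByParam("b")); // prints "null"
    }
}
```

This is why subclasses of a no-parameter base can answer these queries uniformly for every parameter name, valid or not, without consulting any parameter table.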