public static class SelfAttentionLayer.Builder extends SameDiffLayer.Builder<SelfAttentionLayer.Builder>
Fields inherited from class SameDiffLayer.Builder:
paramWeightInit, weightInit, biasUpdater, regularization, regularizationBias, updater, allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints

| Constructor and Description |
|---|
| Builder() |
| Modifier and Type | Method and Description |
|---|---|
| SelfAttentionLayer | build() |
| SelfAttentionLayer.Builder | headSize(int headSize) Size of the attention heads |
| SelfAttentionLayer.Builder | nHeads(int nHeads) Number of attention heads |
| SelfAttentionLayer.Builder | nIn(int nIn) |
| SelfAttentionLayer.Builder | nOut(int nOut) |
| SelfAttentionLayer.Builder | projectInput(boolean projectInput) Whether to project the input before applying attention |
Methods inherited from class SameDiffLayer.Builder:
weightInit, biasUpdater, l1, l1Bias, l2, l2Bias, regularization, regularizationBias, updater, weightDecay, weightDecayBias

Methods inherited from class Layer.Builder:
constrainAllParameters, constrainBias, constrainWeights, dropOut, name

public SelfAttentionLayer.Builder nIn(int nIn)
nIn - Number of inputs to the layer (input size)

public SelfAttentionLayer.Builder nOut(int nOut)
nOut - Number of outputs (output size)

public SelfAttentionLayer.Builder nHeads(int nHeads)
nHeads - Number of attention heads

public SelfAttentionLayer.Builder headSize(int headSize)
headSize - Size of the attention heads

public SelfAttentionLayer.Builder projectInput(boolean projectInput)
projectInput - Whether to project the input before applying attention

public SelfAttentionLayer build()
Overrides: build in class Layer.Builder<SelfAttentionLayer.Builder>

Copyright © 2021. All rights reserved.
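As a sketch of how the builder methods above fit together, the following configures a self-attention layer with input projection (this assumes Deeplearning4j is on the classpath; the sizes 64 and 4 are illustrative, chosen so that nOut is divisible by nHeads):

```java
import org.deeplearning4j.nn.conf.layers.SelfAttentionLayer;

public class SelfAttentionExample {
    public static void main(String[] args) {
        // With projectInput(true), nIn and nOut may differ; when headSize is
        // not set explicitly, it is derived from nOut and nHeads.
        SelfAttentionLayer attention = new SelfAttentionLayer.Builder()
                .nIn(64)             // input size per timestep
                .nOut(64)            // output size per timestep
                .nHeads(4)           // number of attention heads
                .projectInput(true)  // project input before applying attention
                .build();
        System.out.println(attention);
    }
}
```

The resulting layer can then be added to a NeuralNetConfiguration list like any other DL4J layer.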