package mkldnn
Type Members
- class AvgPooling extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
- class CAddTable extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with MklInt8Convertible
- class ConcatTable extends DynamicContainer[Activity, Activity, Float] with MklDnnContainer with MklInt8Convertible
- class DnnGraph extends Graph[Float] with MklDnnLayer with MklInt8Convertible
- class Dropout extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
- case class HeapData(_shape: Array[Int], _layout: Int, _dataType: Int = DataType.F32) extends MemoryData with Product with Serializable
- class Identity extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
  Identity simply returns its input as output. This is useful in a parallel container to obtain the original input.
- class Input extends ReorderMemory
- class JoinTable extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
- class LRN extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
- class Linear extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with Initializable with MklInt8Convertible
- class MaxPooling extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
- sealed trait MemoryData extends Serializable
- trait MklDnnContainer extends DynamicContainer[Activity, Activity, Float] with MklDnnModule
  Helper utilities for integrating containers with MKL-DNN.
- trait MklDnnLayer extends AbstractModule[Activity, Activity, Float] with MklDnnModule
- trait MklDnnModule extends MklDnnModuleHelper
  Helper utilities for integrating a Module with MKL-DNN.
- trait MklDnnModuleHelper extends MemoryOwner
- abstract class MklDnnNativeMemory extends Releasable
- class MklDnnRuntime extends AnyRef
- class MklMemoryAttr extends MklDnnNativeMemory
- class MklMemoryDescInit extends MklDnnNativeMemory
- class MklMemoryPostOps extends MklDnnNativeMemory
- class MklMemoryPrimitive extends MklDnnNativeMemory
- class MklMemoryPrimitiveDesc extends MklDnnNativeMemory
- case class NativeData(_shape: Array[Int], _layout: Int, _dataType: Int = DataType.F32) extends MemoryData with Product with Serializable
- class Output extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
  Converts the output to a user-defined layout and specifies the gradOutput layout.
- sealed class Phase extends AnyRef
- class RNN extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with Initializable
- class ReLU extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with MklInt8Convertible
- class ReorderMemory extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with Releasable
- case class ResNet50PerfParams(batchSize: Int = 16, iteration: Int = 50, training: Boolean = true, model: String = "vgg16") extends Product with Serializable
- class SelectTable extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
  Creates a module that takes a table as input and outputs the element at the given index (positive or negative). The selected element can be either a table or a Tensor. The gradients of the non-selected elements are zeroed Tensors of the same size. This holds regardless of the depth of the encapsulated Tensor, as the function used internally is recursive.
  Annotations: @SerialVersionUID()
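The selection and gradient-zeroing behavior described above can be sketched in plain Scala. This is a toy model of the semantics only, not the BigDL API: `forward` and `backward` are hypothetical helpers, and tables/Tensors are stood in for by `Seq` and `Array[Float]`.

```scala
// Toy illustration of SelectTable semantics (not the BigDL API):
// forward picks one element of the input table; backward routes the
// gradient to that slot and zeroes every other element.
object SelectTableDemo {
  // Forward: supports positive (1-based) and negative indices.
  def forward[T](input: Seq[T], index: Int): T =
    if (index > 0) input(index - 1) else input(input.length + index)

  // Backward: gradOutput flows to the selected slot; the non-selected
  // slots receive zero arrays of the same size as their inputs.
  def backward(input: Seq[Array[Float]], index: Int,
               gradOutput: Array[Float]): Seq[Array[Float]] = {
    val i = if (index > 0) index - 1 else input.length + index
    input.zipWithIndex.map { case (t, j) =>
      if (j == i) gradOutput else Array.fill(t.length)(0f)
    }
  }

  def main(args: Array[String]): Unit = {
    val table = Seq(Array(1f, 2f), Array(3f, 4f))
    println(forward(table, -1).toSeq)   // negative index selects from the end
    println(backward(table, 1, Array(9f, 9f)).map(_.toSeq))
  }
}
```

A real BigDL Table can nest arbitrarily deep, which is why the actual implementation zeroes gradients recursively; the flat `Seq` here keeps the sketch minimal.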
- class Sequential extends DynamicContainer[Activity, Activity, Float] with MklDnnContainer with MklInt8Convertible
- class SoftMax extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
- class SpatialBatchNormalization extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with Initializable with MklInt8Convertible
- class SpatialConvolution extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with Initializable with Serializable with MklInt8Convertible
  Applies a 2D convolution over an input image composed of several input planes. The input tensor given to forward(input) is expected to be a 3D tensor (nInputPlane x height x width).
  Parameters:
  - nInputPlane: the number of expected input planes in the image given to forward()
  - nOutputPlane: the number of output planes the convolution layer will produce
  - kernelW: the kernel width of the convolution
  - kernelH: the kernel height of the convolution
  - strideW: Int = 1, the step of the convolution in the width dimension
  - strideH: Int = 1, the step of the convolution in the height dimension
  - padW: Int = 0, the additional zeros added per width to the input planes
  - padH: Int = 0, the additional zeros added per height to the input planes
  - nGroup: Int = 1, the kernel group number
  - propagateBack: Boolean = true, whether to propagate the gradient back
  - wRegularizer: Regularizer[Float] = null
  - bRegularizer: Regularizer[Float] = null
  - initWeight: Tensor[Float] = null
  - initBias: Tensor[Float] = null
  - initGradWeight: Tensor[Float] = null
  - initGradBias: Tensor[Float] = null
  - withBias: Boolean = true
  - format: DataFormat = DataFormat.NCHW
  - dilationW: Int = 1
  - dilationH: Int = 1
  When padW and padH are both -1, a padding algorithm similar to TensorFlow's "SAME" padding is used. That is:
  outHeight = Math.ceil(inHeight.toFloat / strideH.toFloat)
  outWidth = Math.ceil(inWidth.toFloat / strideW.toFloat)
  padAlongHeight = Math.max(0, (outHeight - 1) * strideH + kernelH - inHeight)
  padAlongWidth = Math.max(0, (outWidth - 1) * strideW + kernelW - inWidth)
  padTop = padAlongHeight / 2
  padLeft = padAlongWidth / 2
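The "SAME" padding formulas above can be checked with a small standalone sketch. The object and method names (`SamePaddingDemo`, `samePadding`) are hypothetical and only mirror the documented arithmetic; they are not part of the BigDL API.

```scala
// Worked example of the "SAME" padding computation (padW == padH == -1),
// following the formulas in the listing above.
object SamePaddingDemo {
  case class Padding(outHeight: Int, outWidth: Int, padTop: Int, padLeft: Int)

  def samePadding(inHeight: Int, inWidth: Int,
                  kernelH: Int, kernelW: Int,
                  strideH: Int, strideW: Int): Padding = {
    // With SAME padding, the output size depends only on input size and stride.
    val outHeight = math.ceil(inHeight.toFloat / strideH.toFloat).toInt
    val outWidth  = math.ceil(inWidth.toFloat / strideW.toFloat).toInt
    // Total padding needed so the kernel exactly covers the input.
    val padAlongHeight = math.max(0, (outHeight - 1) * strideH + kernelH - inHeight)
    val padAlongWidth  = math.max(0, (outWidth - 1) * strideW + kernelW - inWidth)
    // Integer division splits the padding; any odd pixel goes to bottom/right.
    Padding(outHeight, outWidth, padAlongHeight / 2, padAlongWidth / 2)
  }

  def main(args: Array[String]): Unit = {
    // A 224x224 input with a 3x3 kernel and stride 2 yields a 112x112 output.
    println(samePadding(inHeight = 224, inWidth = 224,
                        kernelH = 3, kernelW = 3, strideH = 2, strideW = 2))
  }
}
```

For the 224x224 / 3x3 / stride-2 case, padAlongHeight = max(0, 111 * 2 + 3 - 224) = 1, so padTop = 0 and the remaining single pixel of padding falls on the bottom edge.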
Value Members
- object AvgPooling extends Serializable
- object CAddTable extends Serializable
- object ConcatTable extends Serializable
- object Convolution
- object DnnGraph extends Serializable
- object Dropout extends Serializable
- object Identity extends Serializable
- object Input extends Serializable
- object JoinTable extends Serializable
- object LRN extends Serializable
- object Linear extends Serializable
- object MaxPooling extends Serializable
- object MklDnnMemory
- object Output extends Serializable
- object Perf
- object Phase
- object RNN extends Serializable
- object ReLU extends Serializable
- object ReorderMemory extends Serializable
- object ResNet
- object SbnDnn
- object Scale
- object SelectTable extends Serializable
- object Sequential extends Serializable
- object SoftMax extends Serializable
- object SpatialBatchNormalization extends Serializable
- object SpatialConvolution extends Serializable