package mkldnn


Type Members

  1. class AvgPooling extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
  2. class CAddTable extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with MklInt8Convertible
  3. class ConcatTable extends DynamicContainer[Activity, Activity, Float] with MklDnnContainer with MklInt8Convertible
  4. class DnnGraph extends Graph[Float] with MklDnnLayer with MklInt8Convertible
  5. class Dropout extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
  6. case class HeapData(_shape: Array[Int], _layout: Int, _dataType: Int = DataType.F32) extends MemoryData with Product with Serializable
  7. class Identity extends AbstractModule[Activity, Activity, Float] with MklDnnLayer

    Identity simply returns its input as the output. It's useful in a parallel container to obtain the original input.

  8. class Input extends ReorderMemory
  9. class JoinTable extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
  10. class LRN extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
  11. class Linear extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with Initializable with MklInt8Convertible
  12. class MaxPooling extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
  13. sealed trait MemoryData extends Serializable
  14. trait MklDnnContainer extends DynamicContainer[Activity, Activity, Float] with MklDnnModule

    Helper utilities for integrating containers with MKL-DNN.

  15. trait MklDnnLayer extends AbstractModule[Activity, Activity, Float] with MklDnnModule
  16. trait MklDnnModule extends MklDnnModuleHelper

    Helper utilities for integrating a Module with MKL-DNN.

  17. trait MklDnnModuleHelper extends MemoryOwner
  18. abstract class MklDnnNativeMemory extends Releasable
  19. class MklDnnRuntime extends AnyRef
  20. class MklMemoryAttr extends MklDnnNativeMemory
  21. class MklMemoryDescInit extends MklDnnNativeMemory
  22. class MklMemoryPostOps extends MklDnnNativeMemory
  23. class MklMemoryPrimitive extends MklDnnNativeMemory
  24. class MklMemoryPrimitiveDesc extends MklDnnNativeMemory
  25. case class NativeData(_shape: Array[Int], _layout: Int, _dataType: Int = DataType.F32) extends MemoryData with Product with Serializable
  26. class Output extends AbstractModule[Activity, Activity, Float] with MklDnnLayer

    Converts the output to a user-defined layout and specifies the gradOutput layout.

  27. sealed class Phase extends AnyRef
  28. class RNN extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with Initializable

  29. class ReLU extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with MklInt8Convertible
  30. class ReorderMemory extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with Releasable
  31. case class ResNet50PerfParams(batchSize: Int = 16, iteration: Int = 50, training: Boolean = true, model: String = "vgg16") extends Product with Serializable
  32. class SelectTable extends AbstractModule[Activity, Activity, Float] with MklDnnLayer

    Creates a module that takes a table as input and outputs the element at the given index (positive or negative). The selected element can be either a table or a Tensor. The gradients of the non-selected elements are zeroed Tensors of the same size; this holds regardless of the depth of the encapsulated Tensor, because the zeroing is done recursively.

    Annotations
    @SerialVersionUID()
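
    The selection and gradient-zeroing semantics described above can be sketched in plain Scala. This is an illustrative sketch only, not the BigDL API; the object and method names are invented for this example, and a Seq of arrays stands in for BigDL's table of Tensors:

    ```scala
    // Illustrative sketch (not the BigDL API) of SelectTable semantics:
    // select one element of a "table" (here a Seq), and in backward
    // route gradOutput to the selected slot while all other slots get
    // zeroed gradients of the same size. Indices are 1-based, with
    // negative indices counting from the end, as in Torch-style tables.
    object SelectTableSketch {
      private def resolve(len: Int, index: Int): Int =
        if (index < 0) len + index else index - 1

      def select[A](table: Seq[A], index: Int): A =
        table(resolve(table.length, index))

      def backward(table: Seq[Array[Float]], index: Int,
                   gradOutput: Array[Float]): Seq[Array[Float]] = {
        val i = resolve(table.length, index)
        table.zipWithIndex.map { case (t, j) =>
          if (j == i) gradOutput
          else Array.fill(t.length)(0f) // zeroed Tensor of the same size
        }
      }
    }
    ```

    For example, `select(Seq(a, b, c), -1)` returns `c`, and `backward` on a two-element table with `index = 1` yields the gradient for the first slot and an all-zero array for the second.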
  33. class Sequential extends DynamicContainer[Activity, Activity, Float] with MklDnnContainer with MklInt8Convertible
  34. class SoftMax extends AbstractModule[Activity, Activity, Float] with MklDnnLayer
  35. class SpatialBatchNormalization extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with Initializable with MklInt8Convertible
  36. class SpatialConvolution extends AbstractModule[Activity, Activity, Float] with MklDnnLayer with Initializable with Serializable with MklInt8Convertible

    Applies a 2D convolution over an input image composed of several input planes. The input tensor in forward(input) is expected to be a 3D tensor (nInputPlane x height x width).

    nInputPlane: the number of expected input planes in the image given into forward()
    nOutputPlane: the number of output planes the convolution layer will produce
    kernelW: the kernel width of the convolution
    kernelH: the kernel height of the convolution
    strideW: Int = 1, the step of the convolution in the width dimension
    strideH: Int = 1, the step of the convolution in the height dimension
    padW: Int = 0, the additional zeros added per width to the input planes
    padH: Int = 0, the additional zeros added per height to the input planes
    nGroup: Int = 1, kernel group number
    propagateBack: Boolean = true, whether to propagate gradient back
    wRegularizer: Regularizer[Float] = null
    bRegularizer: Regularizer[Float] = null
    initWeight: Tensor[Float] = null
    initBias: Tensor[Float] = null
    initGradWeight: Tensor[Float] = null
    initGradBias: Tensor[Float] = null
    withBias: Boolean = true
    format: DataFormat = DataFormat.NCHW
    dilationW: Int = 1
    dilationH: Int = 1

    When padW and padH are both -1, a padding algorithm similar to TensorFlow's "SAME" padding is used. That is:

    outHeight = Math.ceil(inHeight.toFloat / strideH.toFloat)
    outWidth = Math.ceil(inWidth.toFloat / strideW.toFloat)

    padAlongHeight = Math.max(0, (outHeight - 1) * strideH + kernelH - inHeight)
    padAlongWidth = Math.max(0, (outWidth - 1) * strideW + kernelW - inWidth)

    padTop = padAlongHeight / 2
    padLeft = padAlongWidth / 2
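
    The "SAME" padding arithmetic above can be checked with a small worked example. This is a standalone sketch for illustration (the object and method names are not part of BigDL); it computes the output size and padding along one spatial dimension:

    ```scala
    // Worked example of the "SAME" padding computation (padW = padH = -1).
    // For one spatial dimension, returns (output size, total padding,
    // padding added before the input, i.e. padTop or padLeft).
    object SamePaddingExample {
      def samePadding(in: Int, stride: Int, kernel: Int): (Int, Int, Int) = {
        val out = math.ceil(in.toFloat / stride.toFloat).toInt
        val padAlong = math.max(0, (out - 1) * stride + kernel - in)
        val padBefore = padAlong / 2
        (out, padAlong, padBefore)
      }
    }
    ```

    For a 224-pixel input with a 3x3 kernel and stride 2, this gives outHeight = ceil(224 / 2) = 112 and padAlongHeight = max(0, 111 * 2 + 3 - 224) = 1, so padTop = 0 and the remaining single row of padding goes on the bottom.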

Value Members

  1. object AvgPooling extends Serializable
  2. object CAddTable extends Serializable
  3. object ConcatTable extends Serializable
  4. object Convolution
  5. object DnnGraph extends Serializable
  6. object Dropout extends Serializable
  7. object Identity extends Serializable
  8. object Input extends Serializable
  9. object JoinTable extends Serializable
  10. object LRN extends Serializable
  11. object Linear extends Serializable
  12. object MaxPooling extends Serializable
  13. object MklDnnMemory
  14. object Output extends Serializable
  15. object Perf
  16. object Phase
  17. object RNN extends Serializable
  18. object ReLU extends Serializable
  19. object ReorderMemory extends Serializable
  20. object ResNet
  21. object SbnDnn
  22. object Scale
  23. object SelectTable extends Serializable
  24. object Sequential extends Serializable
  25. object SoftMax extends Serializable
  26. object SpatialBatchNormalization extends Serializable
  27. object SpatialConvolution extends Serializable
