Packages


com.nvidia.spark.rapids.shims

GpuTypeShims

object GpuTypeShims

Linear Supertypes
AnyRef, Any

Value Members

  4. def additionalArithmeticSupportedTypes: TypeSig

Get the additional types this Shim supports for arithmetic operations

  5. def additionalCommonOperatorSupportedTypes: TypeSig

Get the additional types this Shim supports for common operators (filter, sample, project, alias, table scan, ...), which the GPU supports from the 330 shim (Spark 3.3.0)

  6. def additionalCsvSupportedTypes: TypeSig

Get the additional types this Shim supports for CSV

  7. def additionalParquetSupportedTypes: TypeSig

Get the additional types this Shim supports for Parquet

  8. def additionalPredicateSupportedTypes: TypeSig

Get the additional types this Shim supports for predicates

  11. def columnarCopy(cv: ColumnVector, b: RapidsHostColumnBuilder, dataType: DataType, rows: Int): Unit

Copy a column for computing on GPU. Callers should first check that the type is supported by calling isColumnarCopySupportedForType.

The data type is passed explicitly to allow overriding the type reported by the column vector; there are cases where the reported type does not match the data. See https://github.com/apache/iceberg/issues/6116.

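The guard-then-copy contract above can be sketched as a self-contained stand-in. All names here (ShimSketch, the simplified DataType hierarchy, the Vector[Int] stand-in for a column) are hypothetical placeholders, not the real spark-rapids or Spark classes:

```scala
// Hypothetical sketch of the isColumnarCopySupportedForType / columnarCopy
// contract: check support for the explicitly passed data type first, then
// copy. The explicit data type mirrors the real API, where the type reported
// by the column vector may not match the data.
sealed trait DataType
case object IntType extends DataType
case object MapType extends DataType

object ShimSketch {
  // Stand-in for isColumnarCopySupportedForType: report per-type support.
  def isColumnarCopySupportedForType(t: DataType): Boolean = t match {
    case IntType => true
    case _       => false
  }

  // Stand-in for columnarCopy: fails fast when the type is unsupported.
  def columnarCopy(src: Vector[Int], dataType: DataType, rows: Int): Vector[Int] = {
    require(isColumnarCopySupportedForType(dataType), s"unsupported type: $dataType")
    src.take(rows)
  }
}
```

For example, `ShimSketch.columnarCopy(Vector(1, 2, 3), IntType, 2)` copies the first two rows, while passing `MapType` fails the support check.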
  12. def csvRead(cv: ColumnVector, dt: DataType): ColumnVector
  17. def getConverterForType(t: DataType, nullable: Boolean): TypeConverter

Get the TypeConverter of the data type for this Shim. Note: callers should first check hasConverterForType.

t

the data type

nullable

whether the type is nullable

returns

the row-to-column converter for the data type

  18. def hasConverterForType(otherType: DataType): Boolean

Whether this Shim supports the data type for the row-to-column converter.

otherType

the data type to check in this Shim

returns

true if this Shim supports otherType, false otherwise.

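The check-then-fetch contract of hasConverterForType / getConverterForType can be sketched with a self-contained stand-in; ConverterSketch and the simplified DataType and TypeConverter definitions below are hypothetical placeholders, not the real Shim API:

```scala
// Hypothetical sketch: probe converter support first, then fetch the
// converter, mirroring the documented "check hasConverterForType first"
// contract.
sealed trait DataType
case object DayTimeType extends DataType
case object StringType extends DataType

trait TypeConverter { def convert(v: Any): Any }

object ConverterSketch {
  // Stand-in for hasConverterForType.
  def hasConverterForType(t: DataType): Boolean = t match {
    case DayTimeType => true
    case _           => false
  }

  // Stand-in for getConverterForType: fails fast if support was not checked.
  def getConverterForType(t: DataType, nullable: Boolean): TypeConverter = {
    require(hasConverterForType(t), s"no converter for $t")
    new TypeConverter {
      // Identity conversion; rejects null for non-nullable columns.
      def convert(v: Any): Any = {
        require(nullable || v != null, "null value for non-nullable type")
        v
      }
    }
  }
}
```

A caller would first test `hasConverterForType(t)` and only then call `getConverterForType(t, nullable)`, exactly as the note above requires.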
  19. def hasSideEffectsIfCastFloatToTimestamp: Boolean
  20. def hasSideEffectsIfCastIntToDayTime(dt: DataType): Boolean
  21. def hasSideEffectsIfCastIntToYearMonth(ym: DataType): Boolean
  23. def isColumnarCopySupportedForType(colType: DataType): Boolean

    Whether the Shim supports columnar copy for the given type

  25. def isParquetColumnarWriterSupportedForType(colType: DataType): Boolean
  26. def isSupportedDayTimeType(dt: DataType): Boolean

Whether this Shim supports day-time interval type for specific operators. Alias, Add, Subtract, Positive, ... operators do not support day-time interval type on this Shim. Note: Spark 3.2.x itself does support DayTimeIntervalType; this check is for the GPU operators.

  27. def isSupportedYearMonthType(dt: DataType): Boolean

Whether this Shim supports year-month interval type for specific operators. Alias, Add, Subtract, Positive, ... operators do not support year-month interval type on this Shim.

  31. def supportCsvRead(dt: DataType): Boolean
  32. def supportToScalarForType(t: DataType): Boolean

Whether the Shim supports converting the given type to a GPU Scalar

  34. def toRapidsOrNull(t: DataType): DType

Get the cuDF type for the Spark data type.

t

the Spark data type

returns

the cuDF type if this Shim supports the data type, null otherwise

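The mapping style of toRapidsOrNull can be sketched as a self-contained stand-in; RapidsSketch and the simplified DataType and DType hierarchies are hypothetical placeholders, not Spark's or cuDF's actual types:

```scala
// Hypothetical sketch of toRapidsOrNull: map a Spark-side data type to a
// cuDF-side type, returning null when this Shim does not support it.
sealed trait DataType
case object IntType extends DataType
case object CalendarType extends DataType

sealed trait DType
case object DTypeInt32 extends DType

object RapidsSketch {
  def toRapidsOrNull(t: DataType): DType = t match {
    case IntType => DTypeInt32
    case _       => null // unsupported on this Shim
  }
}
```

Callers can treat a null result as "unsupported", which is why the null check is the idiomatic way to gate a GPU code path on this method.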
  35. def toScalarForType(t: DataType, v: Any): Nothing

Convert the given value to a GPU Scalar

