Packages


com.nvidia.spark.rapids.shims

Spark31XShims

abstract class Spark31XShims extends Spark31Xuntil33XShims with Logging

Linear Supertypes
  Logging, Spark31Xuntil33XShims, SparkShims, AnyRef, Any

Instance Constructors

  1. new Spark31XShims()

Abstract Value Members

  1. abstract def getParquetFilters(schema: MessageType, pushDownDate: Boolean, pushDownTimestamp: Boolean, pushDownDecimal: Boolean, pushDownStartWith: Boolean, pushDownInFilterThreshold: Int, caseSensitive: Boolean, lookupFileMeta: (String) ⇒ String, dateTimeRebaseModeFromConf: String): ParquetFilters
    Definition Classes
    SparkShims

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def addExecBroadcastShuffle(p: SparkPlan): SparkPlan

    If the shim doesn't support executor broadcast, just return the plan passed in

    Definition Classes
    SparkShims
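
    The documented fallback can be sketched as a one-line identity implementation (a hypothetical sketch, assuming Spark is on the classpath; not the actual spark-rapids source):

    ```scala
    import org.apache.spark.sql.execution.SparkPlan

    // Hedged sketch: a shim for a Spark version without executor-broadcast
    // support can satisfy the contract by returning the input plan unchanged.
    def addExecBroadcastShuffle(p: SparkPlan): SparkPlan = p
    ```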
  5. def addRowShuffleToQueryStageTransitionIfNeeded(c2r: ColumnarToRowTransition, sqse: ShuffleQueryStageExec): SparkPlan

    Adds a row-based shuffle to the transitional shuffle query stage if needed. This is needed when AQE plans a GPU shuffle exchange to be reused by a parent plan exec that consumes rows

    Definition Classes
    SparkShims
  6. def ansiCastRule: ExprRule[_ <: Expression]

    Return the replacement rule for AnsiCast. 'AnsiCast' was removed in Spark 3.4.0, so it needs to be handled separately.

    Definition Classes
    Spark31XShims → SparkShims
  7. def applyPostShimPlanRules(plan: SparkPlan): SparkPlan
    Definition Classes
    SparkShims
  8. def applyShimPlanRules(plan: SparkPlan, conf: RapidsConf): SparkPlan
    Definition Classes
    SparkShims
  9. def aqeShuffleReaderExec: ExecRule[_ <: SparkPlan]
    Definition Classes
    Spark31XShims → SparkShims
  10. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  11. def attachTreeIfSupported[TreeType <: TreeNode[_], A](tree: TreeType, msg: String)(f: ⇒ A): A

    dropped by SPARK-34234

    Definition Classes
    Spark31XShims → SparkShims
  12. def avroRebaseReadKey: String
    Definition Classes
    Spark31XShims → SparkShims
  13. def avroRebaseWriteKey: String
    Definition Classes
    Spark31XShims → SparkShims
  14. def broadcastModeTransform(mode: BroadcastMode, rows: Array[InternalRow]): Any
    Definition Classes
    Spark31XShims → SparkShims
  15. def checkCToRWithExecBroadcastAQECoalPart(p: SparkPlan, parent: Option[SparkPlan]): Boolean
    Definition Classes
    SparkShims
  16. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  17. def columnarAdaptivePlan(a: AdaptiveSparkPlanExec, goal: CoalesceSizeGoal): SparkPlan
    Definition Classes
    Spark31XShims → SparkShims
  18. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  19. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  20. def filesFromFileIndex(fileIndex: PartitioningAwareFileIndex): Seq[FileStatus]
    Definition Classes
    Spark31XShims → SparkShims
  21. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  22. def findOperators(plan: SparkPlan, predicate: (SparkPlan) ⇒ Boolean): Seq[SparkPlan]

    Walk the plan recursively and return a list of operators that match the predicate

    Definition Classes
    Spark31XShims → SparkShims
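
    The recursive walk described above can be illustrated with a toy tree standing in for SparkPlan (`Node` and `collectMatching` are hypothetical names used for illustration, not part of the spark-rapids API):

    ```scala
    // Toy stand-in for a plan node: a name plus child nodes.
    final case class Node(name: String, children: Seq[Node] = Seq.empty)

    // Pre-order walk: test this node, then recurse into the children,
    // mirroring what findOperators does over a SparkPlan tree.
    def collectMatching(plan: Node, predicate: Node => Boolean): Seq[Node] = {
      val here = if (predicate(plan)) Seq(plan) else Seq.empty
      here ++ plan.children.flatMap(collectMatching(_, predicate))
    }

    val plan = Node("GpuProject", Seq(Node("Filter", Seq(Node("GpuScan")))))
    val gpuOps = collectMatching(plan, _.name.startsWith("Gpu")).map(_.name)
    // gpuOps == Seq("GpuProject", "GpuScan")
    ```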
  23. def getAdaptiveInputPlan(adaptivePlan: AdaptiveSparkPlanExec): SparkPlan
    Definition Classes
    Spark31XShims → SparkShims
  24. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  25. def getDataWriteCmds: Map[Class[_ <: DataWritingCommand], DataWritingCommandRule[_ <: DataWritingCommand]]
    Definition Classes
    Spark31Xuntil33XShims → SparkShims
  26. def getDateFormatter(): DateFormatter
    Definition Classes
    Spark31XShims → SparkShims
  27. def getExecs: Map[Class[_ <: SparkPlan], ExecRule[_ <: SparkPlan]]
    Definition Classes
    Spark31XShims → SparkShims
  28. def getExprs: Map[Class[_ <: Expression], ExprRule[_ <: Expression]]
    Definition Classes
    Spark31XShims → SparkShims
  29. def getFileScanRDD(sparkSession: SparkSession, readFunction: (PartitionedFile) ⇒ Iterator[InternalRow], filePartitions: Seq[FilePartition], readDataSchema: StructType, metadataColumns: Seq[AttributeReference], fileFormat: Option[FileFormat]): RDD[InternalRow]
    Definition Classes
    Spark31Xuntil33XShims → SparkShims
  30. def getRunnableCmds: Map[Class[_ <: RunnableCommand], RunnableCommandRule[_ <: RunnableCommand]]
    Definition Classes
    Spark31Xuntil33XShims → SparkShims
  31. def getScans: Map[Class[_ <: Scan], ScanRule[_ <: Scan]]
    Definition Classes
    Spark31XShims → SparkShims
  32. def getShuffleFromCToRWithExecBroadcastAQECoalPart(p: SparkPlan): Option[SparkPlan]
    Definition Classes
    SparkShims
  33. def hasAliasQuoteFix: Boolean
    Definition Classes
    Spark31XShims → SparkShims
  34. def hasCastFloatTimestampUpcast: Boolean
    Definition Classes
    Spark31XShims → SparkShims
  35. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  36. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  37. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  38. def int96ParquetRebaseRead(conf: SQLConf): String
    Definition Classes
    Spark31XShims → SparkShims
  39. def int96ParquetRebaseReadKey: String
    Definition Classes
    Spark31XShims → SparkShims
  40. def int96ParquetRebaseWrite(conf: SQLConf): String
    Definition Classes
    Spark31XShims → SparkShims
  41. def int96ParquetRebaseWriteKey: String
    Definition Classes
    Spark31XShims → SparkShims
  42. def isAqePlan(p: SparkPlan): Boolean
    Definition Classes
    Spark31XShims → SparkShims
  43. def isCastingStringToNegDecimalScaleSupported: Boolean
    Definition Classes
    Spark31XShims → SparkShims
  44. def isCustomReaderExec(x: SparkPlan): Boolean
    Definition Classes
    Spark31XShims → SparkShims
  45. def isEmptyRelation(relation: Any): Boolean
    Definition Classes
    Spark31XShims → SparkShims
  46. def isExchangeOp(plan: SparkPlanMeta[_]): Boolean
    Definition Classes
    Spark31XShims → SparkShims
  47. def isExecutorBroadcastShuffle(shuffle: ShuffleExchangeLike): Boolean
    Definition Classes
    SparkShims
  48. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  49. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  50. def isWindowFunctionExec(plan: SparkPlan): Boolean
    Definition Classes
    Spark31XShims → SparkShims
  51. def leafNodeDefaultParallelism(ss: SparkSession): Int
    Definition Classes
    Spark31XShims → SparkShims
  52. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  53. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  54. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  55. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  56. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  57. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  58. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  59. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  60. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  61. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  62. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  63. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  64. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  65. def neverReplaceShowCurrentNamespaceCommand: ExecRule[_ <: SparkPlan]
    Definition Classes
    Spark31Xuntil33XShims → SparkShims
  66. def newBroadcastQueryStageExec(old: BroadcastQueryStageExec, newPlan: SparkPlan): BroadcastQueryStageExec
    Definition Classes
    Spark31XShims → SparkShims
  67. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  68. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  69. def parquetRebaseRead(conf: SQLConf): String
    Definition Classes
    Spark31XShims → SparkShims
  70. def parquetRebaseReadKey: String
    Definition Classes
    Spark31XShims → SparkShims
  71. def parquetRebaseWrite(conf: SQLConf): String
    Definition Classes
    Spark31XShims → SparkShims
  72. def parquetRebaseWriteKey: String
    Definition Classes
    Spark31XShims → SparkShims
  73. def reproduceEmptyStringBug: Boolean

    Handle regexp_replace inconsistency from https://issues.apache.org/jira/browse/SPARK-39107

    Definition Classes
    Spark31XShims → SparkShims
  74. def reusedExchangeExecPfn: PartialFunction[SparkPlan, ReusedExchangeExec]
    Definition Classes
    Spark31XShims → SparkShims
  75. def sessionFromPlan(plan: SparkPlan): SparkSession
    Definition Classes
    Spark31XShims → SparkShims
  76. def shouldFailDivOverflow: Boolean
    Definition Classes
    Spark31XShims → SparkShims
  77. def shuffleParentReadsShuffleData(shuffle: ShuffleExchangeLike, parent: SparkPlan): Boolean
    Definition Classes
    SparkShims
  78. def skipAssertIsOnTheGpu(plan: SparkPlan): Boolean

    Our tests, by default, will check that all operators are running on the GPU, but there are some operators that we do not translate to GPU plans, so we need a way to bypass the check for those.

    Definition Classes
    Spark31XShims → SparkShims
  79. def supportsColumnarAdaptivePlans: Boolean

    Determine if the Spark version allows the supportsColumnar flag to be overridden in AdaptiveSparkPlanExec. This feature was introduced in Spark 3.2 as part of SPARK-35881.

    Definition Classes
    Spark31XShims → SparkShims
  80. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  81. def toString(): String
    Definition Classes
    AnyRef → Any
  82. def tryTransformIfEmptyRelation(mode: BroadcastMode): Option[Any]

    This call can produce an EmptyHashedRelation or an empty array, allowing the AQE rule EliminateJoinToEmptyRelation in Spark 3.1.x to optimize certain joins.

    In Spark 3.2.0, the optimization is still performed (under AQEPropagateEmptyRelation), but the AQE optimizer looks at the metrics for the query stage to determine whether numRows == 0, and if so it can eliminate certain joins.

    The call is implemented only for Spark 3.1.x+. It is disabled in Databricks because it requires a task context to perform the BroadcastMode.transform call, but we'd like to call this from the driver.

    Definition Classes
    Spark31XShims → SparkShims
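
    A hedged usage sketch for this member (assumes Spark on the classpath and a SparkShims instance in scope; `maybeEliminate` is a hypothetical caller, not part of the spark-rapids API):

    ```scala
    import org.apache.spark.sql.catalyst.plans.physical.BroadcastMode

    // Probe whether a broadcast can be reduced to an empty relation; on success
    // the AQE rule can substitute the empty relation and eliminate the join.
    def maybeEliminate(shims: SparkShims, mode: BroadcastMode): Unit =
      shims.tryTransformIfEmptyRelation(mode) match {
        case Some(emptyRelation) =>
          // substitute emptyRelation into the plan, enabling join elimination
        case None =>
          // not empty, or unsupported on this platform (e.g. Databricks)
      }
    ```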
  83. def v1RepairTableCommand(tableName: TableIdentifier): RunnableCommand
    Definition Classes
    Spark31XShims → SparkShims
  84. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  85. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  86. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()

Inherited from Logging

Inherited from Spark31Xuntil33XShims

Inherited from SparkShims

Inherited from AnyRef

Inherited from Any
