object ExplainPlan
- Linear Supertypes: ExplainPlan, AnyRef, Any
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def explainPotentialGpuPlan(df: DataFrame, explain: String = "ALL"): String

  Looks at the CPU plan associated with the DataFrame and outputs information about which parts of the query the RAPIDS Accelerator for Apache Spark could place on the GPU. This only applies to the initial plan, so if running with adaptive query execution enabled, it will not be able to show any changes in the plan due to that.

  This is very similar to the output you would get by running the query with the RAPIDS Accelerator enabled and with the config spark.rapids.sql.enabled enabled. Requires the RAPIDS Accelerator for Apache Spark jar and the RAPIDS cudf jar to be included in the classpath, but the RAPIDS Accelerator for Apache Spark itself should be disabled.

  Calling from Scala:

    val output = com.nvidia.spark.rapids.ExplainPlan.explainPotentialGpuPlan(df)

  Calling from PySpark:

    output = sc._jvm.com.nvidia.spark.rapids.ExplainPlan.explainPotentialGpuPlan(df._jdf, "ALL")

  - df: The Spark DataFrame to get the query plan from.
  - explain: If ALL, returns all the explain data; otherwise returns only what does not work on the GPU. Default is ALL.
  - returns: String containing the explained plan.
  - Annotations: @throws( ... ) @throws( ... )
  - Exceptions thrown:
    - java.lang.IllegalArgumentException if an argument is invalid or it is unable to determine the Spark version
    - java.lang.IllegalStateException if the plugin gets into an invalid state while trying to process the plan or there is an unexpected exception
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
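Since explainPotentialGpuPlan requires the RAPIDS jars on the classpath while the accelerator itself stays disabled, a session for it might be launched as sketched below. This is an illustrative config fragment only: the jar file names and <version> placeholders are assumptions, not taken from this page; substitute the jars for your environment.

```shell
# Hypothetical spark-shell launch for plan inspection (jar names/versions are placeholders).
# The RAPIDS jars are on the classpath, but spark.rapids.sql.enabled=false keeps the
# accelerator disabled, so the query plans on the CPU and can then be analyzed by
# com.nvidia.spark.rapids.ExplainPlan.explainPotentialGpuPlan(df)
spark-shell \
  --jars rapids-4-spark_2.12-<version>.jar,cudf-<version>.jar \
  --conf spark.rapids.sql.enabled=false
```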