
object ExprChecks

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def aggNotGroupByOrReduction(outputCheck: TypeSig, sparkOutputSig: TypeSig, paramCheck: Seq[ParamCheck] = Seq.empty, repeatingParamCheck: Option[RepeatingParamCheck] = None): ExprChecks

    An aggregation check where window operations are supported by the plugin, but Spark also supports group by and reduction on these. This is currently only for 'collect_list', which the plugin supports only as a window operation.

  5. def aggNotWindow(outputCheck: TypeSig, sparkOutputSig: TypeSig, paramCheck: Seq[ParamCheck] = Seq.empty, repeatingParamCheck: Option[RepeatingParamCheck] = None): ExprChecks

    An aggregation check where group by and reduction are supported by the plugin, but Spark also supports window operations on these.

  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. def binaryProjectNotLambda(outputCheck: TypeSig, sparkOutputSig: TypeSig, param1: (String, TypeSig, TypeSig), param2: (String, TypeSig, TypeSig)): ExprChecks

    Helper function for a binary expression where the plugin only supports project, but Spark supports lambda too.
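
    A sketch of how the two `(name, TypeSig, TypeSig)` tuples for a binary expression could be turned into per-parameter checks. The types here are hypothetical stand-ins for illustration, not the real spark-rapids `TypeSig` and `ParamCheck`:

    ```scala
    object BinaryCheckSketch {
      // Stand-in for TypeSig: a named set of supported Spark types (illustrative only).
      final case class TypeSig(types: Set[String])
      // Stand-in for ParamCheck: a named parameter with plugin and Spark signatures.
      final case class ParamCheck(name: String, cudf: TypeSig, spark: TypeSig)

      // Convert a (name, pluginSig, sparkSig) tuple into a ParamCheck.
      def toParamCheck(p: (String, TypeSig, TypeSig)): ParamCheck =
        ParamCheck(p._1, p._2, p._3)

      val numeric = TypeSig(Set("INT", "LONG", "DOUBLE"))

      // Two named parameters for a binary expression such as an arithmetic op.
      val params: Seq[ParamCheck] = Seq(
        toParamCheck(("lhs", numeric, numeric)),
        toParamCheck(("rhs", numeric, numeric)))
    }
    ```

    The tuple form keeps call sites short while the factory normalizes everything into the same per-parameter check structure.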

  8. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. def fullAgg(outputCheck: TypeSig, sparkOutputSig: TypeSig, paramCheck: Seq[ParamCheck] = Seq.empty, repeatingParamCheck: Option[RepeatingParamCheck] = None): ExprChecks

    Aggregate operation where window, reduction, and group by aggregation are all supported in the same way.

  13. def fullAggAndProject(outputCheck: TypeSig, sparkOutputSig: TypeSig, paramCheck: Seq[ParamCheck] = Seq.empty, repeatingParamCheck: Option[RepeatingParamCheck] = None): ExprChecks

    For a generic expression that can work as both an aggregation and in the project context. This is really just for PythonUDF.

  14. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  15. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  16. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  17. val mathUnary: ExprChecks

    Math unary checks where the input and output are both DoubleType. Spark supports these for both project and lambda, but the plugin only supports project.

  18. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  19. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  20. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  21. def projectNotLambda(outputCheck: TypeSig, sparkOutputSig: TypeSig, paramCheck: Seq[ParamCheck] = Seq.empty, repeatingParamCheck: Option[RepeatingParamCheck] = None): ExprChecks

    A check for an expression that only supports project in the plugin, but Spark also supports this expression in lambda.

  22. def projectOnly(outputCheck: TypeSig, sparkOutputSig: TypeSig, paramCheck: Seq[ParamCheck] = Seq.empty, repeatingParamCheck: Option[RepeatingParamCheck] = None): ExprChecks

    A check for an expression that only supports project, both in Spark and in the plugin.
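
    Each of these checks pairs what the plugin can handle with what Spark itself can handle. A minimal sketch of the shape of a `projectOnly`-style call, using hypothetical stand-in types rather than the real spark-rapids `TypeSig`, `ParamCheck`, and `ExprChecks`:

    ```scala
    object ProjectOnlySketch {
      // Stand-in for TypeSig: a named set of supported Spark types (illustrative only).
      final case class TypeSig(types: Set[String]) {
        def +(other: TypeSig): TypeSig = TypeSig(types ++ other.types)
      }
      final case class ParamCheck(name: String, cudf: TypeSig, spark: TypeSig)
      final case class ExprChecks(
          outputCheck: TypeSig,    // what the plugin can produce
          sparkOutputSig: TypeSig, // what Spark itself can produce
          paramCheck: Seq[ParamCheck])

      // Shape of a projectOnly-style factory: it just bundles the signatures.
      def projectOnly(
          outputCheck: TypeSig,
          sparkOutputSig: TypeSig,
          paramCheck: Seq[ParamCheck] = Seq.empty): ExprChecks =
        ExprChecks(outputCheck, sparkOutputSig, paramCheck)

      val numeric = TypeSig(Set("INT", "LONG", "DOUBLE"))
      val numericAndDecimal = numeric + TypeSig(Set("DECIMAL"))

      // The plugin handles numeric types only; Spark also allows decimal.
      val check: ExprChecks = projectOnly(
        numeric,
        numericAndDecimal,
        Seq(ParamCheck("input", numeric, numericAndDecimal)))
    }
    ```

    Keeping the plugin signature and the Spark signature side by side is what lets a check report precisely which inputs fall back to the CPU.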

  23. def reductionAndGroupByAgg(outputCheck: TypeSig, sparkOutputSig: TypeSig, paramCheck: Seq[ParamCheck] = Seq.empty, repeatingParamCheck: Option[RepeatingParamCheck] = None): ExprChecks

    Aggregate operation where only group by aggregation and reduction are supported, both in the plugin and in Spark.

  24. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  25. def toString(): String
    Definition Classes
    AnyRef → Any
  26. def unaryProject(outputCheck: TypeSig, sparkOutputSig: TypeSig, inputCheck: TypeSig, sparkInputSig: TypeSig): ExprChecks

    A check for a unary expression that only supports project, both in Spark and in the plugin.

  27. def unaryProjectNotLambda(outputCheck: TypeSig, sparkOutputSig: TypeSig, inputCheck: TypeSig, sparkInputSig: TypeSig): ExprChecks

    A check for a unary expression that only supports project in the plugin, but Spark also supports this expression in lambda.

  28. def unaryProjectNotLambdaInputMatchesOutput(check: TypeSig, sparkSig: TypeSig): ExprChecks

    Unary expression checks for project where the input matches the output, but Spark also supports this expression in lambda mode.

  29. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
  32. def windowOnly(outputCheck: TypeSig, sparkOutputSig: TypeSig, paramCheck: Seq[ParamCheck] = Seq.empty, repeatingParamCheck: Option[RepeatingParamCheck] = None): ExprChecks

    Window only operations. Spark does not support these operations as anything but a window operation.
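
    The aggregation factories above (fullAgg, reductionAndGroupByAgg, windowOnly, aggNotWindow, aggNotGroupByOrReduction) differ mainly in which execution contexts they allow. A toy model of that idea, with hypothetical names that are not the plugin's real implementation:

    ```scala
    object AggContextSketch {
      // The three contexts an aggregate expression can run in.
      sealed trait AggContext
      case object GroupByAgg extends AggContext
      case object Reduction  extends AggContext
      case object Window     extends AggContext

      // A check here is reduced to just the set of contexts the plugin supports.
      final case class AggCheck(supported: Set[AggContext])

      // fullAgg: window, reduction, and group by are all supported the same way.
      def fullAgg: AggCheck = AggCheck(Set(GroupByAgg, Reduction, Window))

      // reductionAndGroupByAgg: everything except window operations.
      def reductionAndGroupByAgg: AggCheck = AggCheck(Set(GroupByAgg, Reduction))

      // windowOnly: nothing but window operations.
      def windowOnly: AggCheck = AggCheck(Set(Window))
    }
    ```

    The real factories additionally carry the plugin and Spark type signatures per context; this sketch only captures the context dimension that distinguishes them.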
