object TypeSig

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. val ARRAY: TypeSig

    ARRAY type support, but not very useful on its own because no nested types under it are supported
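
    A minimal sketch of how nested types might be added to a container signature. The `nested` combinator name is an assumption based on the descriptions above (this page only states that nested types must be added, not how):

    ```scala
    // Hypothetical sketch: ARRAY on its own matches nothing useful, so
    // child types are added explicitly. `nested` is an assumed method name.
    val arrayOfBasicTypes: TypeSig =
      TypeSig.ARRAY.nested(TypeSig.commonCudfTypes)

    // The same pattern would apply to MAP and STRUCT, which otherwise
    // match nothing useful / only empty structs.
    val structOfNumbers: TypeSig =
      TypeSig.STRUCT.nested(TypeSig.numeric)
    ```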

  5. val BINARY: TypeSig
  6. val BOOLEAN: TypeSig
  7. val BYTE: TypeSig
  8. val CALENDAR: TypeSig
  9. val DATE: TypeSig
  10. val DECIMAL: TypeSig
  11. val DOUBLE: TypeSig
  12. val FLOAT: TypeSig
  13. val INT: TypeSig
  14. val LONG: TypeSig
  15. val MAP: TypeSig

    MAP type support, but not very useful on its own because no nested types under it are supported

  16. val NULL: TypeSig
  17. val SHORT: TypeSig
  18. val STRING: TypeSig
  19. val STRUCT: TypeSig

    STRUCT type support, but only matches empty structs unless you add nested types to it.

  20. val TIMESTAMP: TypeSig
  21. val UDT: TypeSig

    User Defined Type (We don't support these in the plugin yet)

  22. val all: TypeSig

    All types, nested and not nested

  23. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  24. val atomics: TypeSig

    All values that correspond to Spark's AtomicType

  25. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
  26. val commonCudfTypes: TypeSig

    A signature for types that are generally supported by the plugin/CUDF. Please make sure to check what Spark actually supports instead of blindly using this in a signature.

  27. val comparable: TypeSig

    All types that Spark supports for comparison operators (really everything but MAP according to https://spark.apache.org/docs/latest/api/sql/index.html#_12), e.g. "<=>", "=", "==".

  28. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  29. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  30. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  31. val fp: TypeSig

    All floating point types

  32. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  33. def getDataType(expr: Expression): Option[DataType]
  34. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  35. val integral: TypeSig

    All integer types

  36. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  37. def lit(dataType: TypeEnum.Value): TypeSig

    Create a TypeSig that only supports a literal of the given type.
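    A hedged sketch of how `lit` might be combined with other signatures. The `+` combinator is implied by doc text elsewhere on this page ("fp + integral + DECIMAL"), and the `TypeEnum` value name is an assumption:

    ```scala
    // Sketch: accept DOUBLE in general, but INT only when it appears as
    // a literal. `TypeEnum.INT` is an assumed enum value name.
    val doubleOrIntLit: TypeSig =
      TypeSig.DOUBLE + TypeSig.lit(TypeEnum.INT)
    ```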

  38. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  39. val none: TypeSig

    No types supported at all

  40. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  41. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  42. val numeric: TypeSig

    All numeric types: fp + integral + DECIMAL
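
    The documented composition can be written out directly; this sketch assumes `+` unions two signatures, as the description "fp + integral + DECIMAL" suggests:

    ```scala
    // Sketch: an equivalent of `numeric`, built from its documented parts.
    val numericLike: TypeSig =
      TypeSig.fp + TypeSig.integral + TypeSig.DECIMAL
    ```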

  43. val numericAndInterval: TypeSig

    numeric + CALENDAR

  44. val orderable: TypeSig

    All types that Spark supports sorting/ordering on (really everything but MAP)

  45. def psNote(dataType: TypeEnum.Value, note: String): TypeSig

    Create a TypeSig that has partial support for the given type.
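    A hedged sketch of declaring partial support with an explanatory note. The `TypeEnum` value name and the note text are purely illustrative assumptions:

    ```scala
    // Sketch: TIMESTAMP is supported only partially, with a note that
    // would surface in generated support documentation.
    val tsPartial: TypeSig =
      TypeSig.psNote(TypeEnum.TIMESTAMP, "only UTC timestamps are supported")
    ```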

  46. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  47. def toString(): String
    Definition Classes
    AnyRef → Any
  48. val unionOfPandasUdfOut: TypeSig

    Different types of Pandas UDF support different sets of output types. Please refer to https://github.com/apache/spark/blob/master/python/pyspark/sql/udf.py#L98 for more details.

    It is impossible to specify the exact type signature for each Pandas UDF type in the single expression 'PythonUDF', so this signature is the union of all the supported output type sets, covering all the cases.

  49. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  50. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  51. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
