
object Types extends Logging

A helper object with mapping functions between Exasol JDBC types and Spark SQL types.

Linear Supertypes
Logging, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. val LongDecimal: DecimalType
  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  7. def convertSparkPrecisionScaleToExasol(decimalType: DecimalType): String

    Converts a Spark DecimalType with the given precision and scale to the corresponding Exasol type.

    For example:

    Spark DecimalType(5,2) -> "DECIMAL(5,2)"

    Exasol supports a maximum precision and scale of 36; Spark precision or scale values greater than 36 are truncated to 36.

    decimalType

    A Spark DecimalType with precision and scale

    returns

    The equivalent Exasol type
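
    Example (a minimal sketch; the capped result for out-of-range values is an assumption based on the truncation note above):

      import org.apache.spark.sql.types.DecimalType

      // Within Exasol's limits, precision and scale map directly.
      Types.convertSparkPrecisionScaleToExasol(DecimalType(5, 2))   // "DECIMAL(5,2)"
      // Precision and scale above 36 should be truncated to Exasol's maximum.
      Types.convertSparkPrecisionScaleToExasol(DecimalType(38, 37)) // expected "DECIMAL(36,36)"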

  8. def createSparkStructType(rsmd: ResultSetMetaData): StructType

    Given a java.sql.ResultSetMetaData returns a Spark org.apache.spark.sql.types.StructType schema.

    rsmd

    A result set metadata

    returns

    A StructType matching result set types
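
    Example (a minimal sketch; the JDBC URL, credentials and query are hypothetical):

      import java.sql.DriverManager

      val connection = DriverManager.getConnection("jdbc:exa:10.0.0.11:8563", "sys", "exasol")
      val statement = connection.createStatement()
      val resultSet = statement.executeQuery("SELECT * FROM MY_SCHEMA.MY_TABLE")
      // Derive a Spark schema from the JDBC result set metadata.
      val sparkSchema = Types.createSparkStructType(resultSet.getMetaData)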

  9. def createSparkTypeFromSQLType(sqlType: Int, precision: Int, scale: Int, isSigned: Boolean): DataType

    Maps a JDBC type java.sql.Types$ to a Spark SQL org.apache.spark.sql.types.DataType.

    sqlType

    A JDBC type from java.sql.ResultSetMetaData column type

    precision

    A precision value obtained from ResultSetMetaData, rsmd.getPrecision(index)

    scale

    A scale value obtained from ResultSetMetaData, rsmd.getScale(index)

    isSigned

    An isSigned value obtained from ResultSetMetaData, rsmd.isSigned(index)

    returns

    A Spark SQL DataType corresponding to JDBC SQL type
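
    Example (a minimal sketch; the resulting Spark types are assumptions about the mapping):

      // A VARCHAR column presumably maps to StringType; precision and scale are ignored.
      Types.createSparkTypeFromSQLType(java.sql.Types.VARCHAR, 0, 0, true)
      // A DECIMAL column presumably keeps its precision and scale.
      Types.createSparkTypeFromSQLType(java.sql.Types.DECIMAL, 10, 2, true) // e.g. DecimalType(10, 2)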

  10. def createTableSchema(schema: StructType): String

    Returns comma-separated column names and column types for an Exasol table from a Spark schema.

    The NOT NULL constraint is skipped if the Spark DataFrame field is of type org.apache.spark.sql.types.StringType$.

    schema

    A Spark org.apache.spark.sql.types.StructType schema

    returns

    Comma-separated column names and their types
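
    Example (a minimal sketch; the exact Exasol type names depend on the mapping in exasolTypeFromSparkDataType):

      import org.apache.spark.sql.types._

      val schema = StructType(Seq(
        StructField("ID", LongType, nullable = false),
        StructField("NAME", StringType, nullable = false)
      ))
      // Produces "COLUMN_NAME EXASOL_TYPE [NOT NULL]" pairs; NAME gets no
      // NOT NULL constraint since it is a StringType.
      Types.createTableSchema(schema)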

  11. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  12. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  13. def exasolTypeFromSparkDataType(dataType: DataType): String

    Returns the corresponding Exasol type, as a string, for a given Spark org.apache.spark.sql.types.DataType.

    dataType

    A Spark DataType (e.g. org.apache.spark.sql.types.StringType$)

    returns

    A default Exasol type as string for this DataType
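
    Example (a minimal sketch; the returned strings are assumptions about the default mapping):

      import org.apache.spark.sql.types._

      Types.exasolTypeFromSparkDataType(IntegerType) // e.g. "INTEGER"
      Types.exasolTypeFromSparkDataType(StringType)  // e.g. "CLOB"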

  14. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  15. def getMaxPrecisionExasol(): Int
    The maximum decimal precision supported by Exasol (36, per the truncation note above).
  16. def getMaxScaleExasol(): Int
    The maximum decimal scale supported by Exasol (36, per the truncation note above).
  17. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  18. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  19. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  20. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  21. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  22. def jdbcTypeFromSparkDataType(dataType: DataType): Int

    Returns the corresponding JDBC type (java.sql.Types$) for a given Spark org.apache.spark.sql.types.DataType.

    dataType

    A Spark DataType (e.g. org.apache.spark.sql.types.StringType$)

    returns

    The default JDBC type for this DataType
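
    Example (a minimal sketch; the specific java.sql.Types constants are assumptions):

      import org.apache.spark.sql.types._

      Types.jdbcTypeFromSparkDataType(IntegerType) // e.g. java.sql.Types.INTEGER
      Types.jdbcTypeFromSparkDataType(StringType)  // e.g. java.sql.Types.VARCHAR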

  23. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  24. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  25. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  26. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  27. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  28. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  29. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  30. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  31. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  32. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  33. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  35. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  36. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  37. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  38. def selectColumns(columns: Array[String], schema: StructType): StructType

    Selects only the required columns from a Spark SQL schema.

    Adapted from Spark JDBCRDD private function pruneSchema.

    columns

    A list of required columns

    schema

    A Spark SQL schema

    returns

    A new Spark SQL schema containing only the given columns, in the order of the column names
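
    Example (a minimal sketch):

      import org.apache.spark.sql.types._

      val schema = StructType(Seq(
        StructField("ID", LongType),
        StructField("NAME", StringType),
        StructField("CITY", StringType)
      ))
      // Keeps only NAME and ID, in that order.
      Types.selectColumns(Array("NAME", "ID"), schema)
      // => StructType(StructField("NAME", StringType), StructField("ID", LongType))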

  39. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  40. def toString(): String
    Definition Classes
    AnyRef → Any
  41. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  42. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  43. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
