object Types extends Logging
A helper object with mapping functions between Exasol JDBC types and Spark SQL types.
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- val LongDecimal: DecimalType
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native() @HotSpotIntrinsicCandidate()
- def convertSparkPrecisionScaleToExasol(decimalType: DecimalType): String
  Converts a Spark DecimalType with the given precision and scale to an Exasol type string.
  For example: Spark DecimalType(5,2) -> "DECIMAL(5,2)".
  Exasol has a maximum precision and scale of 36; Spark precision or scale values greater than 36 are truncated to 36.
  - decimalType: A Spark DecimalType with precision and scale
  - returns: The equivalent Exasol type
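The truncation behaviour can be sketched as follows. Note that `DecimalMapping` and `toExasolDecimal` are hypothetical names for illustration only, not the connector's internals:

```scala
// Illustrative sketch: cap precision and scale at Exasol's maximum of 36.
// The object and method names here are hypothetical, not the connector's API.
object DecimalMapping {
  val MaxPrecision = 36
  val MaxScale = 36

  def toExasolDecimal(precision: Int, scale: Int): String = {
    val p = math.min(precision, MaxPrecision)
    val s = math.min(scale, MaxScale)
    s"DECIMAL($p,$s)"
  }
}
```

For example, `toExasolDecimal(5, 2)` yields `"DECIMAL(5,2)"`, while an out-of-range Spark DecimalType(38,37) is capped to `"DECIMAL(36,36)"`.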
- def createSparkStructType(rsmd: ResultSetMetaData): StructType
  Given a java.sql.ResultSetMetaData, returns a Spark org.apache.spark.sql.types.StructType schema.
  - rsmd: A result set metadata
  - returns: A StructType matching the result set types
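The shape of this conversion can be sketched without a live database by modelling only the metadata calls that matter. `ColumnMeta`, `Field`, and `toSchema` below are hypothetical stand-ins for java.sql.ResultSetMetaData and StructType; the one detail carried over from JDBC is that column indices are 1-based:

```scala
// Hypothetical stand-in for the parts of java.sql.ResultSetMetaData used here;
// a real implementation would call rsmd.getColumnCount, rsmd.getColumnLabel, etc.
trait ColumnMeta {
  def columnCount: Int
  def columnLabel(i: Int): String // JDBC column indices are 1-based
  def columnType(i: Int): Int     // a java.sql.Types constant
  def isNullable(i: Int): Boolean
}

final case class Field(name: String, sqlType: Int, nullable: Boolean)

// Walk columns 1..columnCount and collect one Field per column
def toSchema(meta: ColumnMeta): Seq[Field] =
  (1 to meta.columnCount).map { i =>
    Field(meta.columnLabel(i), meta.columnType(i), meta.isNullable(i))
  }
```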
- def createSparkTypeFromSQLType(sqlType: Int, precision: Int, scale: Int, isSigned: Boolean): DataType
  Maps a JDBC type java.sql.Types$ to a Spark SQL org.apache.spark.sql.types.DataType.
  - sqlType: A JDBC type from a java.sql.ResultSetMetaData column type
  - precision: A precision value obtained from ResultSetMetaData, rsmd.getPrecision(index)
  - scale: A scale value obtained from ResultSetMetaData, rsmd.getScale(index)
  - isSigned: An isSigned value obtained from ResultSetMetaData, rsmd.isSigned(index)
  - returns: A Spark SQL DataType corresponding to the JDBC SQL type
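A simplified, hypothetical version of such a mapping is sketched below using the standard java.sql.Types constants; the connector's actual rules (e.g. for unsigned integers or decimals) may differ:

```scala
import java.sql.Types

// Hypothetical, simplified mapping from JDBC type codes to Spark SQL type
// names; not the connector's actual table. Unsigned integers are widened so
// that their full value range still fits.
def sparkTypeName(sqlType: Int, precision: Int, scale: Int, isSigned: Boolean): String =
  sqlType match {
    case Types.TINYINT | Types.SMALLINT => "ShortType"
    case Types.INTEGER                  => if (isSigned) "IntegerType" else "LongType"
    case Types.BIGINT                   => if (isSigned) "LongType" else "DecimalType(20,0)"
    case Types.FLOAT | Types.DOUBLE     => "DoubleType"
    case Types.DECIMAL | Types.NUMERIC  => s"DecimalType($precision,$scale)"
    case Types.BOOLEAN                  => "BooleanType"
    case Types.DATE                     => "DateType"
    case Types.TIMESTAMP                => "TimestampType"
    case _                              => "StringType" // fallback for unhandled types
  }
```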
- def createTableSchema(schema: StructType): String
  Returns comma-separated column names and column types for an Exasol table from a Spark schema.
  It skips the NOT NULL constraint if the Spark dataframe schema type is an org.apache.spark.sql.types.StringType$ type.
  - schema: A Spark org.apache.spark.sql.types.StructType schema
  - returns: Comma-separated column names and their types
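The column-list construction, including the string-column exception to NOT NULL, can be sketched like this; `Column` and `tableSchema` are hypothetical stand-ins for a Spark StructField and the real method:

```scala
// Hypothetical sketch: build "name TYPE[, ...]" from a schema-like sequence.
// NOT NULL is emitted only for non-nullable columns that are not strings.
// Column is a stand-in for a Spark StructField.
final case class Column(name: String, exasolType: String, nullable: Boolean, isString: Boolean)

def tableSchema(columns: Seq[Column]): String =
  columns.map { c =>
    val notNull = if (!c.nullable && !c.isString) " NOT NULL" else ""
    s"${c.name} ${c.exasolType}$notNull"
  }.mkString(", ")
```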
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def exasolTypeFromSparkDataType(dataType: DataType): String
  Returns the corresponding Exasol type as a string for a given Spark org.apache.spark.sql.types.DataType.
  - dataType: A Spark DataType (e.g. org.apache.spark.sql.types.StringType$)
  - returns: A default Exasol type as a string for this DataType
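Such a lookup can be sketched as a simple pattern match; the Exasol types on the right are illustrative guesses, not the connector's actual defaults:

```scala
// Hypothetical default mappings from Spark type names to Exasol types; the
// values on the right are illustrative, not the connector's actual defaults.
def exasolType(sparkTypeName: String): String = sparkTypeName match {
  case "IntegerType"   => "INTEGER"
  case "LongType"      => "BIGINT"
  case "DoubleType"    => "DOUBLE"
  case "BooleanType"   => "BOOLEAN"
  case "StringType"    => "VARCHAR(2000000)"
  case "DateType"      => "DATE"
  case "TimestampType" => "TIMESTAMP"
  case other           => throw new IllegalArgumentException(s"No Exasol mapping for $other")
}
```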
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- def getMaxPrecisionExasol(): Int
- def getMaxScaleExasol(): Int
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
  - Attributes: protected
  - Definition Classes: Logging
- def initializeLogIfNecessary(isInterpreter: Boolean): Unit
  - Attributes: protected
  - Definition Classes: Logging
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- def isTraceEnabled(): Boolean
  - Attributes: protected
  - Definition Classes: Logging
- def jdbcTypeFromSparkDataType(dataType: DataType): Int
  Returns the corresponding JDBC java.sql.Types$ type for a given Spark org.apache.spark.sql.types.DataType.
  - dataType: A Spark DataType (e.g. org.apache.spark.sql.types.StringType$)
  - returns: A default JdbcType for this DataType
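The reverse direction of the type mapping can be sketched against the standard java.sql.Types constants; the pairings below are plausible defaults, not the connector's actual table:

```scala
import java.sql.Types

// Hypothetical sketch of a Spark-to-JDBC type lookup; the actual table is
// defined inside the connector and may differ.
def jdbcType(sparkTypeName: String): Int = sparkTypeName match {
  case "IntegerType"   => Types.INTEGER
  case "LongType"      => Types.BIGINT
  case "DoubleType"    => Types.DOUBLE
  case "BooleanType"   => Types.BOOLEAN
  case "StringType"    => Types.VARCHAR
  case "DateType"      => Types.DATE
  case "TimestampType" => Types.TIMESTAMP
  case other           => throw new IllegalArgumentException(s"No JDBC mapping for $other")
}
```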
- def log: Logger
  - Attributes: protected
  - Definition Classes: Logging
- def logDebug(msg: ⇒ String, throwable: Throwable): Unit
  - Attributes: protected
  - Definition Classes: Logging
- def logDebug(msg: ⇒ String): Unit
  - Attributes: protected
  - Definition Classes: Logging
- def logError(msg: ⇒ String, throwable: Throwable): Unit
  - Attributes: protected
  - Definition Classes: Logging
- def logError(msg: ⇒ String): Unit
  - Attributes: protected
  - Definition Classes: Logging
- def logInfo(msg: ⇒ String, throwable: Throwable): Unit
  - Attributes: protected
  - Definition Classes: Logging
- def logInfo(msg: ⇒ String): Unit
  - Attributes: protected
  - Definition Classes: Logging
- def logName: String
  - Attributes: protected
  - Definition Classes: Logging
- def logTrace(msg: ⇒ String, throwable: Throwable): Unit
  - Attributes: protected
  - Definition Classes: Logging
- def logTrace(msg: ⇒ String): Unit
  - Attributes: protected
  - Definition Classes: Logging
- def logWarning(msg: ⇒ String, throwable: Throwable): Unit
  - Attributes: protected
  - Definition Classes: Logging
- def logWarning(msg: ⇒ String): Unit
  - Attributes: protected
  - Definition Classes: Logging
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- def selectColumns(columns: Array[String], schema: StructType): StructType
  Selects only the required columns from a Spark SQL schema.
  Adapted from the Spark JDBCRDD private function pruneSchema.
  - columns: A list of required columns
  - schema: A Spark SQL schema
  - returns: A new Spark SQL schema with only the given columns, in the order of the column names
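The pruning idea can be sketched without Spark on the classpath; `SchemaField` and this `selectColumns` are hypothetical stand-ins for StructField and the real method:

```scala
// Hypothetical sketch of schema pruning in the spirit of Spark's pruneSchema:
// keep only the requested columns, in the order they were requested.
// SchemaField stands in for a Spark StructField.
final case class SchemaField(name: String, dataType: String)

def selectColumns(columns: Array[String], schema: Seq[SchemaField]): Seq[SchemaField] = {
  val byName = schema.map(f => f.name -> f).toMap
  columns.toSeq.map(byName) // throws if a requested column is missing
}
```

Note the result follows the order of the requested column names, not the order of the original schema.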
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
Deprecated Value Members
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] ) @Deprecated
  - Deprecated