The data type for collections of multiple values.
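As a minimal sketch (assuming Spark SQL is on the classpath), an ArrayType is built from an element type plus a flag indicating whether elements may be null:

```scala
import org.apache.spark.sql.types._

object ArrayTypeExample {
  // An array of integers whose elements may not be null.
  // ArrayType(IntegerType) alone defaults to containsNull = true.
  val arr: ArrayType = ArrayType(IntegerType, containsNull = false)

  def main(args: Array[String]): Unit = {
    println(arr.simpleString) // the compact type string, e.g. "array<int>"
  }
}
```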
An internal type used to represent every value that is not null and not a UDT, array, struct, or map.
The data type representing Array[Byte] values.
The data type representing Boolean values.
The data type representing Byte values.
The data type representing calendar time intervals.
Hive char type.
The base type of all Spark SQL data types.
A date type, supporting "0001-01-01" through "9999-12-31".
A mutable implementation of BigDecimal that can hold a Long if values are small enough.
The data type representing java.math.BigDecimal values.
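A decimal type carries a precision (total number of digits) and a scale (digits after the decimal point); a brief sketch, assuming Spark SQL is available:

```scala
import org.apache.spark.sql.types._

object DecimalTypeExample {
  // 10 total digits, 2 of them after the decimal point --
  // a common choice for currency-like columns.
  val price: DecimalType = DecimalType(10, 2)
}
```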
The data type representing Double values.
The data type representing Float values.
A Hive string type, kept for compatibility.
The data type representing Int values.
The data type representing Long values.
The data type for Maps.
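A MapType pairs a key type with a value type, plus a flag for whether values may be null (keys never are); a minimal sketch assuming Spark SQL on the classpath:

```scala
import org.apache.spark.sql.types._

object MapTypeExample {
  // A map from string keys to long values; values may be null.
  val m: MapType = MapType(StringType, LongType, valueContainsNull = true)
}
```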
Metadata is a wrapper over Map[String, Any] that limits the value type to simple ones: Boolean, Long, Double, String, Metadata, Array[Boolean], Array[Long], Array[Double], Array[String], and Array[Metadata].
Builder for Metadata.
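Since Metadata itself is immutable, it is normally assembled through a MetadataBuilder; a short sketch (hypothetical key names, assuming Spark SQL is available):

```scala
import org.apache.spark.sql.types._

object MetadataExample {
  // Build an immutable Metadata map holding only the permitted
  // simple value types (here a String and a Long).
  val md: Metadata = new MetadataBuilder()
    .putString("comment", "user id")
    .putLong("maxLength", 64L)
    .build()
}
```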
The data type representing NULL values.
Numeric data types.
Represents a JVM object that is passing through Spark SQL expression evaluation.
The data type representing Short values.
The data type representing String values.
A field inside a StructType.
A StructType object can be constructed by StructType(fields: Seq[StructField]).
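For example, a schema with scalar and nested fields can be sketched as follows (assuming Spark SQL on the classpath; fields can then be extracted by name):

```scala
import org.apache.spark.sql.types._

object StructTypeExample {
  // A schema of three fields; nullable defaults to true when omitted.
  val schema: StructType = StructType(Seq(
    StructField("id", LongType, nullable = false),
    StructField("name", StringType),
    StructField("scores", ArrayType(DoubleType))
  ))

  // Extract a single field by name.
  val nameField: StructField = schema("name")
}
```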
The data type representing java.sql.Timestamp values.
Hive varchar type.
An AbstractDataType that matches any concrete data types.
Companion object for ArrayType.
Extra factory methods and pattern matchers for Decimals.
Metadata key used to store the raw hive type string in the metadata of StructField. This is relevant for datatypes that do not have a direct Spark SQL counterpart, such as CHAR and VARCHAR. We need to preserve the original type in order to invoke the correct object inspector in Hive.
Contains a type system for attributes produced by relations, including complex types like structs, arrays and maps.