
org.apache.spark.sql.delta

GeneratedColumn

object GeneratedColumn extends DeltaLogging with AnalysisHelper

Provide utility methods to implement Generated Columns for Delta. Users can use the following SQL syntax to create a table with generated columns.

CREATE TABLE table_identifier(
  column_name column_type,
  column_name column_type GENERATED ALWAYS AS ( generation_expr ),
  ...
) USING delta
[ PARTITIONED BY (partition_column_name, ...) ]

This is an example:

CREATE TABLE foo(
  id bigint,
  type string,
  subType string GENERATED ALWAYS AS ( SUBSTRING(type FROM 0 FOR 4) ),
  data string,
  eventTime timestamp,
  day date GENERATED ALWAYS AS ( days(eventTime) )
) USING delta
PARTITIONED BY (type, day)

When writing to a table, for these generated columns:
- If the output is missing a generated column, we will add an expression to generate it.
- If a generated column exists in the output (in other words, the user provides a value for it), we will add a constraint to ensure the given value doesn't violate the generation expression.
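The two write-time behaviors above can be sketched as follows. This is a hypothetical usage sketch, not part of the API reference: it assumes a running Spark session with Delta enabled, the example table `foo` above, and a Spark version whose `INSERT INTO` supports explicit column lists.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("generated-columns-sketch").getOrCreate()

// `day` (and `subType`) are missing from the output: Delta adds expressions
// to compute them from `eventTime` and `type`.
spark.sql("""
  INSERT INTO foo (id, type, data, eventTime)
  VALUES (1, 'click', 'payload', timestamp'2024-01-01 10:00:00')
""")

// `day` is provided explicitly: Delta adds a constraint instead, so a value
// that disagrees with days(eventTime) would fail the write.
spark.sql("""
  INSERT INTO foo (id, type, data, eventTime, day)
  VALUES (2, 'click', 'payload', timestamp'2024-01-01 10:00:00', date'2024-01-01')
""")
```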

Inherited
  1. GeneratedColumn
  2. AnalysisHelper
  3. DeltaLogging
  4. DatabricksLogging
  5. DeltaProgressReporter
  6. LoggingShims
  7. Logging
  8. AnyRef
  9. Any

Type Members

  1. implicit class LogStringContext extends AnyRef
    Definition Classes
    LoggingShims

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  6. def deltaAssert(check: ⇒ Boolean, name: String, msg: String, deltaLog: DeltaLog = null, data: AnyRef = null, path: Option[Path] = None): Unit

Helper method to check invariants in Delta code. Fails when running in tests; otherwise records a delta assertion event and logs a warning.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  7. def enforcesGeneratedColumns(protocol: Protocol, metadata: Metadata): Boolean

Whether the table has generated columns. A table has generated columns only if its protocol supports the Generated Columns feature (listed in Table Features or supported implicitly) and some columns in the table schema contain generation expressions.

As Spark will propagate column metadata storing the generation expression through the entire plan, old versions that don't support generated columns may create tables whose schemas contain generation expressions. However, since these old versions have a lower writer version, we can use the table's minWriterVersion to identify such tables and treat them as normal tables.

    protocol

    the table protocol.

    metadata

    the table metadata.
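A minimal usage sketch, assuming an existing Delta table and noting that `org.apache.spark.sql.delta` is an internal package, so this API may change between releases:

```scala
import org.apache.spark.sql.delta.{DeltaLog, GeneratedColumn}

// Load the table's log and latest snapshot (path is a placeholder).
val deltaLog = DeltaLog.forTable(spark, "/path/to/table")
val snapshot = deltaLog.update()

// True only when the protocol supports generated columns AND the schema
// actually carries generation expressions.
val enforced = GeneratedColumn.enforcesGeneratedColumns(snapshot.protocol, snapshot.metadata)
```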

  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. def generatePartitionFilters(spark: SparkSession, snapshot: SnapshotDescriptor, dataFilters: Seq[Expression], delta: LogicalPlan): Seq[Expression]

Try to generate partition filters from data filters if possible.

    delta

the logical plan that outputs the same attributes as the table schema. This will be used to resolve auto-generated expressions.

  12. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  13. def getCommonTags(deltaLog: DeltaLog, tahoeId: String): Map[TagDefinition, String]
    Definition Classes
    DeltaLogging
  14. def getErrorData(e: Throwable): Map[String, Any]
    Definition Classes
    DeltaLogging
  15. def getGeneratedColumns(snapshot: SnapshotDescriptor): Seq[StructField]

Returns the generated columns of a table. A column is a generated column if:
- the table writer protocol >= GeneratedColumn.MIN_WRITER_VERSION; and
- it has a generation expression in its column metadata.

  16. def getGeneratedColumnsAndColumnsUsedByGeneratedColumns(schema: StructType): Set[String]
  17. def getGenerationExpression(field: StructField): Option[Expression]

Return the generation expression from a field, if any. This method doesn't check the protocol; the caller should make sure the table writer protocol satisfies satisfyGeneratedColumnProtocol before calling this method.
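As an illustration of where the expression lives, the sketch below builds a StructField carrying a generation expression in its column metadata. The metadata key "delta.generationExpression" is Delta's key for this, stated here as an assumption rather than a guarantee of the exact constant:

```scala
import org.apache.spark.sql.types.{MetadataBuilder, StringType, StructField}

// A column definition equivalent to:
//   subType string GENERATED ALWAYS AS ( SUBSTRING(type FROM 0 FOR 4) )
val field = StructField(
  "subType",
  StringType,
  nullable = true,
  new MetadataBuilder()
    .putString("delta.generationExpression", "SUBSTRING(type FROM 0 FOR 4)")
    .build())
```

Given such a field, getGenerationExpressionStr(field.metadata) would return the stored expression string, and getGenerationExpression(field) would return it parsed as an Expression.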

  18. def getGenerationExpressionStr(metadata: Metadata): Option[String]

    Return the generation expression from a field metadata if any.

  19. def getOptimizablePartitionExpressions(schema: StructType, partitionSchema: StructType): Map[String, Seq[OptimizablePartitionExpression]]

Try to get OptimizablePartitionExpressions of a data column when a partition column is defined as a generated column and refers to this data column.

    schema

    the table schema

    partitionSchema

    the partition schema. If a partition column is defined as a generated column, its column metadata should contain the generation expression.

  20. def hasGeneratedColumns(schema: StructType): Boolean

Whether any generation expressions exist in the schema. Note: this doesn't mean the table contains generated columns. A table has generated columns only if its protocol supports the Generated Columns feature (listed in Table Features or supported implicitly) and some columns in the table schema contain generation expressions. Use enforcesGeneratedColumns to check for generated column tables instead.
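The schema-only nature of this check can be sketched as follows (a hypothetical example; the "delta.generationExpression" metadata key is assumed, and `org.apache.spark.sql.delta` is an internal package):

```scala
import org.apache.spark.sql.delta.GeneratedColumn
import org.apache.spark.sql.types.{DateType, MetadataBuilder, StructField, StructType, TimestampType}

val schema = StructType(Seq(
  StructField("eventTime", TimestampType),
  StructField("day", DateType, nullable = true,
    new MetadataBuilder()
      .putString("delta.generationExpression", "CAST(eventTime AS DATE)")
      .build())))

// True: the schema carries a generation expression. Whether the table
// actually *enforces* generated columns additionally depends on the
// protocol, which is what enforcesGeneratedColumns checks.
val hasExprs = GeneratedColumn.hasGeneratedColumns(schema)
```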

  21. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  22. def improveUnsupportedOpError(f: ⇒ Unit): Unit
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  23. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  24. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  25. def isGeneratedColumn(protocol: Protocol, field: StructField): Boolean

    Whether a column is a generated column.

  26. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  27. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  28. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  29. def logConsole(line: String): Unit
    Definition Classes
    DatabricksLogging
  30. def logDebug(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  31. def logDebug(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  32. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  33. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def logError(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  35. def logError(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  36. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  37. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  38. def logInfo(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  39. def logInfo(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  40. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  41. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  42. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  43. def logTrace(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  44. def logTrace(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  45. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  46. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  47. def logWarning(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  48. def logWarning(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  49. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  50. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  51. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  52. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  53. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  54. def partitionFilterOptimizationEnabled(spark: SparkSession): Boolean
  55. def recordDeltaEvent(deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty, data: AnyRef = null, path: Option[Path] = None): Unit

Used to record the occurrence of a single event or report detailed, operation-specific statistics.

    path

    Used to log the path of the delta table when deltaLog is null.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  56. def recordDeltaOperation[A](deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: ⇒ A): A

Used to report the duration as well as the success or failure of an operation on a deltaLog.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  57. def recordDeltaOperationForTablePath[A](tablePath: String, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: ⇒ A): A

Used to report the duration as well as the success or failure of an operation on a tahoePath.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  58. def recordEvent(metric: MetricDefinition, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  59. def recordFrameProfile[T](group: String, name: String)(thunk: ⇒ T): T
    Attributes
    protected
    Definition Classes
    DeltaLogging
  60. def recordOperation[S](opType: OpType, opTarget: String = null, extraTags: Map[TagDefinition, String], isSynchronous: Boolean = true, alwaysRecordStats: Boolean = false, allowAuthTags: Boolean = false, killJvmIfStuck: Boolean = false, outputMetric: MetricDefinition = METRIC_OPERATION_DURATION, silent: Boolean = true)(thunk: ⇒ S): S
    Definition Classes
    DatabricksLogging
  61. def recordProductEvent(metric: MetricDefinition with CentralizableMetric, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  62. def recordProductUsage(metric: MetricDefinition with CentralizableMetric, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  63. def recordUsage(metric: MetricDefinition, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  64. def resolveReferencesForExpressions(sparkSession: SparkSession, exprs: Seq[Expression], planProvidingAttrs: LogicalPlan): Seq[Expression]

Resolve expressions using the attributes provided by planProvidingAttrs. Throw an error if failing to resolve any expressions.

    Attributes
    protected
    Definition Classes
    AnalysisHelper
  65. def satisfyGeneratedColumnProtocol(protocol: Protocol): Boolean
  66. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  67. def toDataset(sparkSession: SparkSession, logicalPlan: LogicalPlan): Dataset[Row]
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  68. def toString(): String
    Definition Classes
    AnyRef → Any
  69. def tryResolveReferences(sparkSession: SparkSession)(expr: Expression, planContainingExpr: LogicalPlan): Expression
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  70. def tryResolveReferencesForExpressions(sparkSession: SparkSession)(exprs: Seq[Expression], plansProvidingAttrs: Seq[LogicalPlan]): Seq[Expression]

Resolve expressions using the attributes provided by plansProvidingAttrs, ignoring errors.

    Attributes
    protected
    Definition Classes
    AnalysisHelper
  71. def tryResolveReferencesForExpressions(sparkSession: SparkSession, exprs: Seq[Expression], planContainingExpr: LogicalPlan): Seq[Expression]
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  72. def validateColumnReferences(spark: SparkSession, fieldName: String, expression: Expression, schema: StructType): Unit

SPARK-27561 added support for lateral column alias. This means generation expressions that reference other generated columns no longer fail analysis in validateGeneratedColumns.

This method checks for and throws an error if:
- a generated column references itself, or
- a generated column references another generated column.

  73. def validateGeneratedColumns(spark: SparkSession, schema: StructType): Unit

If the schema contains generated columns, check the following unsupported cases:
- Refer to a non-existent column or another generated column.
- Use an unsupported expression.
- The expression type is not the same as the column type.
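The unsupported cases can be illustrated with DDL that would be rejected at analysis time. This is a hypothetical sketch assuming a Spark session with Delta enabled; the exact error messages are not specified here:

```scala
// Case: a generated column refers to another generated column.
spark.sql("""
  CREATE TABLE bad_refs(
    a INT,
    b INT GENERATED ALWAYS AS (a + 1),
    c INT GENERATED ALWAYS AS (b + 1)
  ) USING delta
""")

// Case: the expression type does not match the declared column type
// (CAST(ts AS DATE) yields DATE, but the column is declared STRING).
spark.sql("""
  CREATE TABLE bad_type(
    ts TIMESTAMP,
    d STRING GENERATED ALWAYS AS (CAST(ts AS DATE))
  ) USING delta
""")
```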

  74. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  75. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  76. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  77. def withStatusCode[T](statusCode: String, defaultMessage: String, data: Map[String, Any] = Map.empty)(body: ⇒ T): T

Report a log to indicate some command is running.

    Definition Classes
    DeltaProgressReporter
