
case class WriteIntoDelta(deltaLog: DeltaLog, mode: SaveMode, options: DeltaOptions, partitionColumns: Seq[String], configuration: Map[String, String], data: DataFrame, catalogTableOpt: Option[CatalogTable] = None, schemaInCatalog: Option[StructType] = None) extends LogicalPlan with LeafRunnableCommand with ImplicitMetadataOperation with DeltaCommand with WriteIntoDeltaLike with Product with Serializable

Used to write a DataFrame into a Delta table.

New Table Semantics

  • The schema of the DataFrame is used to initialize the table.
  • The partition columns will be used to partition the table.

Existing Table Semantics

  • The save mode controls how existing data is handled (e.g. overwrite, append).
  • The schema of the DataFrame is checked, and any new columns present are added to the table's schema. Conflicting column types (e.g. an INT and a STRING) result in an exception.
  • The partition columns, if present, are validated against the existing metadata. If not present, the table's existing partitioning is respected.
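The schema-evolution behavior above is normally exercised through the public DataFrameWriter API, which routes to this command for Delta tables. A minimal sketch (the table path and column name are hypothetical; an active SparkSession with the Delta Lake extensions is assumed):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.lit

val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog",
    "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .getOrCreate()

// Initial write: the DataFrame's schema initializes the table.
spark.range(5).write.format("delta").save("/tmp/delta/events")

// Append with a new column: the new column is added to the table's
// schema only because mergeSchema is enabled; conflicting types
// would still throw.
spark.range(5).withColumn("tag", lit("a"))
  .write
  .format("delta")
  .mode("append")
  .option("mergeSchema", "true")
  .save("/tmp/delta/events")
```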

In combination with Overwrite, a replaceWhere option can be used to transactionally replace data that matches a predicate.

In combination with Overwrite, dynamic partition overwrite mode (the option partitionOverwriteMode set to dynamic, or the Spark conf spark.sql.sources.partitionOverwriteMode set to dynamic) is also supported.

Dynamic partition overwrite mode conflicts with replaceWhere:

  • If a replaceWhere option is provided and dynamic partition overwrite mode is enabled in the DataFrameWriter options, an error is thrown.
  • If a replaceWhere option is provided and dynamic partition overwrite mode is enabled in the Spark conf, data is overwritten according to the replaceWhere expression.
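Both overwrite variants are driven through DataFrameWriter options. A sketch, assuming an existing Delta table partitioned by a hypothetical date column and a DataFrame df holding the replacement rows:

```scala
import org.apache.spark.sql.DataFrame

def overwriteByPredicate(df: DataFrame, path: String): Unit = {
  // Transactionally replace only the rows matching the predicate.
  df.write
    .format("delta")
    .mode("overwrite")
    .option("replaceWhere", "date >= '2024-01-01'")
    .save(path)
}

def overwriteDynamicPartitions(df: DataFrame, path: String): Unit = {
  // Replace only the partitions present in `df`; other partitions are
  // left untouched. Combining this writer option with replaceWhere
  // throws an error, as described above.
  df.write
    .format("delta")
    .mode("overwrite")
    .option("partitionOverwriteMode", "dynamic")
    .save(path)
}
```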
catalogTableOpt

Should be set explicitly when the table is accessed from a catalog.

schemaInCatalog

The schema created in the catalog. When set (in the CTAS code path), this schema is used to update the metadata; otherwise the schema of data is used.

Linear Supertypes
Serializable, Serializable, WriteIntoDeltaLike, DeltaCommand, ImplicitMetadataOperation, DeltaLogging, DatabricksLogging, DeltaProgressReporter, LoggingShims, LeafRunnableCommand, LeafLike[LogicalPlan], RunnableCommand, Command, LogicalPlan, Logging, QueryPlanConstraints, ConstraintHelper, LogicalPlanDistinctKeys, LogicalPlanStats, AnalysisHelper, QueryPlan[LogicalPlan], SQLConfHelper, TreeNode[LogicalPlan], WithOrigin, TreePatternBits, Product, Equals, AnyRef, Any

Instance Constructors

  1. new WriteIntoDelta(deltaLog: DeltaLog, mode: SaveMode, options: DeltaOptions, partitionColumns: Seq[String], configuration: Map[String, String], data: DataFrame, catalogTableOpt: Option[CatalogTable] = None, schemaInCatalog: Option[StructType] = None)

    catalogTableOpt

    Should be set explicitly when the table is accessed from a catalog.

    schemaInCatalog

    The schema created in the catalog. When set (in the CTAS code path), this schema is used to update the metadata; otherwise the schema of data is used.
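For illustration, the constructor can also be invoked directly and the resulting command executed via run. Note this is an internal API subject to change; the table path is hypothetical, and an active SparkSession spark plus a DataFrame df are assumed:

```scala
import org.apache.spark.sql.SaveMode
import org.apache.spark.sql.delta.{DeltaLog, DeltaOptions}
import org.apache.spark.sql.delta.commands.WriteIntoDelta

// Resolve the table's transaction log from its path.
val deltaLog = DeltaLog.forTable(spark, "/tmp/delta/events")

val write = WriteIntoDelta(
  deltaLog = deltaLog,
  mode = SaveMode.Append,
  options = new DeltaOptions(Map.empty[String, String], spark.sessionState.conf),
  partitionColumns = Seq.empty,
  configuration = Map.empty,
  data = df)

// Executes the write as a single Delta transaction.
write.run(spark)
```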

Type Members

  1. implicit class LogStringContext extends AnyRef
    Definition Classes
    LoggingShims

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. lazy val allAttributes: AttributeSeq
    Definition Classes
    QueryPlan
  5. def analyzed: Boolean
    Definition Classes
    AnalysisHelper
  6. def apply(number: Int): TreeNode[_]
    Definition Classes
    TreeNode
  7. def argString(maxFields: Int): String
    Definition Classes
    TreeNode
  8. def asCode: String
    Definition Classes
    TreeNode
  9. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  10. def assertNotAnalysisRule(): Unit
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  11. def buildBaseRelation(spark: SparkSession, txn: OptimisticTransaction, actionType: String, rootPath: Path, inputLeafFiles: Seq[String], nameToAddFileMap: Map[String, AddFile]): HadoopFsRelation

    Build a base relation of files that need to be rewritten as part of an update/delete/merge operation.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  12. val canMergeSchema: Boolean
    Attributes
    protected
    Definition Classes
    WriteIntoDelta → ImplicitMetadataOperation
  13. val canOverwriteSchema: Boolean
    Attributes
    protected
    Definition Classes
    WriteIntoDelta → ImplicitMetadataOperation
  14. final lazy val canonicalized: LogicalPlan
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  15. val catalogTableOpt: Option[CatalogTable]
  16. final def children: Seq[LogicalPlan]
    Definition Classes
    LeafLike
  17. def childrenResolved: Boolean
    Definition Classes
    LogicalPlan
  18. def clone(): LogicalPlan
    Definition Classes
    AnalysisHelper → TreeNode → AnyRef
  19. def collect[B](pf: PartialFunction[LogicalPlan, B]): Seq[B]
    Definition Classes
    TreeNode
  20. def collectFirst[B](pf: PartialFunction[LogicalPlan, B]): Option[B]
    Definition Classes
    TreeNode
  21. def collectLeaves(): Seq[LogicalPlan]
    Definition Classes
    TreeNode
  22. def collectWithSubqueries[B](f: PartialFunction[LogicalPlan, B]): Seq[B]
    Definition Classes
    QueryPlan
  23. def conf: SQLConf
    Definition Classes
    SQLConfHelper
  24. val configuration: Map[String, String]

    The configuration to be used for writing data into the Delta table.

    Definition Classes
    WriteIntoDelta → WriteIntoDeltaLike
  25. lazy val constraints: ExpressionSet
    Definition Classes
    QueryPlanConstraints
  26. def constructIsNotNullConstraints(constraints: ExpressionSet, output: Seq[Attribute]): ExpressionSet
    Definition Classes
    ConstraintHelper
  27. final def containsAllPatterns(patterns: TreePattern*): Boolean
    Definition Classes
    TreePatternBits
  28. final def containsAnyPattern(patterns: TreePattern*): Boolean
    Definition Classes
    TreePatternBits
  29. lazy val containsChild: Set[TreeNode[_]]
    Definition Classes
    TreeNode
  30. final def containsPattern(t: TreePattern): Boolean
    Definition Classes
    TreePatternBits
    Annotations
    @inline()
  31. def copyTagsFrom(other: LogicalPlan): Unit
    Definition Classes
    TreeNode
  32. def createSetTransaction(sparkSession: SparkSession, deltaLog: DeltaLog, options: Option[DeltaOptions] = None): Option[SetTransaction]

    Returns SetTransaction if a valid app ID and version are present; otherwise returns None.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  33. val data: DataFrame

    Data to be written into the Delta table.

    Definition Classes
    WriteIntoDelta → WriteIntoDeltaLike
  34. def deltaAssert(check: ⇒ Boolean, name: String, msg: String, deltaLog: DeltaLog = null, data: AnyRef = null, path: Option[Path] = None): Unit

    Helper method to check invariants in Delta code. Fails when running in tests; otherwise records a delta assertion event and logs a warning.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  35. val deltaLog: DeltaLog
    Definition Classes
    WriteIntoDelta → WriteIntoDeltaLike
  36. lazy val deterministic: Boolean
    Definition Classes
    QueryPlan
  37. lazy val distinctKeys: Set[ExpressionSet]
    Definition Classes
    LogicalPlanDistinctKeys
  38. def doCanonicalize(): LogicalPlan
    Attributes
    protected
    Definition Classes
    QueryPlan
  39. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  40. def exists(f: (LogicalPlan) ⇒ Boolean): Boolean
    Definition Classes
    TreeNode
  41. final def expressions: Seq[Expression]
    Definition Classes
    QueryPlan
  42. def extractConstraints(sparkSession: SparkSession, expr: Seq[Expression]): Seq[Constraint]
    Attributes
    protected
    Definition Classes
    WriteIntoDeltaLike
  43. def fastEquals(other: TreeNode[_]): Boolean
    Definition Classes
    TreeNode
  44. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  45. def find(f: (LogicalPlan) ⇒ Boolean): Option[LogicalPlan]
    Definition Classes
    TreeNode
  46. def flatMap[A](f: (LogicalPlan) ⇒ TraversableOnce[A]): Seq[A]
    Definition Classes
    TreeNode
  47. def foreach(f: (LogicalPlan) ⇒ Unit): Unit
    Definition Classes
    TreeNode
  48. def foreachUp(f: (LogicalPlan) ⇒ Unit): Unit
    Definition Classes
    TreeNode
  49. def formattedNodeName: String
    Attributes
    protected
    Definition Classes
    QueryPlan
  50. def generateCandidateFileMap(basePath: Path, candidateFiles: Seq[AddFile]): Map[String, AddFile]

    Generates a map of file names to add file entries for operations where we will need to rewrite files such as delete, merge, update. We expect file names to be unique, because each file contains a UUID.

    Definition Classes
    DeltaCommand
  51. def generateTreeString(depth: Int, lastChildren: ArrayList[Boolean], append: (String) ⇒ Unit, verbose: Boolean, prefix: String, addSuffix: Boolean, maxFields: Int, printNodeId: Boolean, indent: Int): Unit
    Definition Classes
    TreeNode
  52. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  53. def getCommonTags(deltaLog: DeltaLog, tahoeId: String): Map[TagDefinition, String]
    Definition Classes
    DeltaLogging
  54. def getDefaultTreePatternBits: BitSet
    Attributes
    protected
    Definition Classes
    TreeNode
  55. def getDeltaLog(spark: SparkSession, path: Option[String], tableIdentifier: Option[TableIdentifier], operationName: String, hadoopConf: Map[String, String] = Map.empty): DeltaLog

    Utility method to return the DeltaLog of an existing Delta table referred to by either the given path or tableIdentifier.

    spark

    SparkSession reference to use.

    path

    Table location. Expects a non-empty tableIdentifier or path.

    tableIdentifier

    Table identifier. Expects a non-empty tableIdentifier or path.

    operationName

    Operation that is getting the DeltaLog, used in error messages.

    hadoopConf

    Hadoop file system options used to build DeltaLog.

    returns

    DeltaLog of the table

    Attributes
    protected
    Definition Classes
    DeltaCommand
    Exceptions thrown

    AnalysisException if no Delta table exists at the given path/identifier, or if neither path nor tableIdentifier is provided.

  56. def getDeltaTable(target: LogicalPlan, cmd: String): DeltaTableV2

    Extracts the DeltaTableV2 from a LogicalPlan iff the LogicalPlan is a ResolvedTable with either a DeltaTableV2 or a V1Table that is referencing a Delta table. In all other cases this method will throw a "Table not found" exception.

    Definition Classes
    DeltaCommand
  57. def getDeltaTablePathOrIdentifier(target: LogicalPlan, cmd: String): (Option[TableIdentifier], Option[String])

    Helper method to extract the table id or path from a LogicalPlan representing a Delta table. This uses DeltaCommand.getDeltaTable to convert the LogicalPlan to a DeltaTableV2 and then extracts either the path or identifier from it. If the DeltaTableV2 has a CatalogTable, the table identifier will be returned. Otherwise, the table's path will be returned. Throws an exception if the LogicalPlan does not represent a Delta table.

    Definition Classes
    DeltaCommand
  58. def getErrorData(e: Throwable): Map[String, Any]
    Definition Classes
    DeltaLogging
  59. def getMetadataAttributeByName(name: String): AttributeReference
    Definition Classes
    LogicalPlan
  60. def getMetadataAttributeByNameOpt(name: String): Option[AttributeReference]
    Definition Classes
    LogicalPlan
  61. final def getNewDomainMetadata(txn: OptimisticTransaction, canUpdateMetadata: Boolean, isReplacingTable: Boolean, clusterBySpecOpt: Option[ClusterBySpec] = None): Seq[DomainMetadata]

    Returns a sequence of new DomainMetadata if canUpdateMetadata is true and the operation is either create table or replace the whole table (not a replaceWhere operation). This is because we only update Domain Metadata when creating or replacing a table, and replace table for DDL and DataFrameWriterV2 is already handled in CreateDeltaTableCommand. In that case, canUpdateMetadata is false, so we don't update again.

    txn

    OptimisticTransaction being used to create or replace table.

    canUpdateMetadata

    true if the metadata is not updated yet.

    isReplacingTable

    true if the operation is replace table without replaceWhere option.

    clusterBySpecOpt

    optional ClusterBySpec containing user-specified clustering columns.

    Attributes
    protected
    Definition Classes
    ImplicitMetadataOperation
  62. def getTableCatalogTable(target: LogicalPlan, cmd: String): Option[CatalogTable]

    Extracts CatalogTable metadata from a LogicalPlan if the plan is a ResolvedTable. The table can be a non-Delta table.

    Definition Classes
    DeltaCommand
  63. def getTablePathOrIdentifier(target: LogicalPlan, cmd: String): (Option[TableIdentifier], Option[String])

    Helper method to extract the table id or path from a LogicalPlan representing a resolved table or path. This calls getDeltaTablePathOrIdentifier if the resolved table is a Delta table. For a non-Delta table with an identifier, we extract its identifier. For a non-Delta table with a path, it expects the path to be wrapped in a ResolvedPathBasedNonDeltaTable and extracts it from there.

    Definition Classes
    DeltaCommand
  64. def getTagValue[T](tag: TreeNodeTag[T]): Option[T]
    Definition Classes
    TreeNode
  65. def getTouchedFile(basePath: Path, escapedFilePath: String, nameToAddFileMap: Map[String, AddFile]): AddFile

    Find the AddFile record corresponding to the file that was read as part of a delete/update/merge operation.

    basePath

    The path of the table. Must not be escaped.

    escapedFilePath

    The path to a file that can be either absolute or relative. All special chars in this path must be already escaped by URI standards.

    nameToAddFileMap

    Map generated through generateCandidateFileMap().

    Definition Classes
    DeltaCommand
  66. def hasBeenExecuted(txn: OptimisticTransaction, sparkSession: SparkSession, options: Option[DeltaOptions] = None): Boolean

    Returns true if there is information in the spark session that indicates that this write has already been successfully written.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  67. def hashCode(): Int
    Definition Classes
    TreeNode → AnyRef → Any
  68. def inferAdditionalConstraints(constraints: ExpressionSet): ExpressionSet
    Definition Classes
    ConstraintHelper
  69. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  70. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  71. def innerChildren: Seq[QueryPlan[_]]
    Definition Classes
    QueryPlan → TreeNode
  72. def inputSet: AttributeSet
    Definition Classes
    QueryPlan
  73. final def invalidateStatsCache(): Unit
    Definition Classes
    LogicalPlanStats
  74. def isCanonicalizedPlan: Boolean
    Attributes
    protected
    Definition Classes
    QueryPlan
  75. def isCatalogTable(analyzer: Analyzer, tableIdent: TableIdentifier): Boolean

    Use the analyzer to determine whether the provided TableIdentifier refers to a path-based table or not.

    analyzer

    The session state analyzer to call

    tableIdent

    Table identifier to check for being path-based or not

    returns

    true if the table is defined in a metastore, false if it is a path-based table

    Definition Classes
    DeltaCommand
  76. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  77. def isPathIdentifier(tableIdent: TableIdentifier): Boolean

    Checks if the given identifier can be for a delta table's path

    tableIdent

    Table Identifier for which to check

    Attributes
    protected
    Definition Classes
    DeltaCommand
  78. def isRuleIneffective(ruleId: RuleId): Boolean
    Attributes
    protected
    Definition Classes
    TreeNode
  79. def isStreaming: Boolean
    Definition Classes
    LogicalPlan
  80. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  81. def jsonFields: List[JField]
    Attributes
    protected
    Definition Classes
    TreeNode
  82. final def legacyWithNewChildren(newChildren: Seq[LogicalPlan]): LogicalPlan
    Attributes
    protected
    Definition Classes
    TreeNode
  83. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  84. def logConsole(line: String): Unit
    Definition Classes
    DatabricksLogging
  85. def logDebug(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  86. def logDebug(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  87. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  88. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  89. def logError(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  90. def logError(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  91. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  92. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  93. def logInfo(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  94. def logInfo(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  95. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  96. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  97. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  98. def logTrace(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  99. def logTrace(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  100. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  101. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  102. def logWarning(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  103. def logWarning(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  104. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  105. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  106. def makeCopy(newArgs: Array[AnyRef]): LogicalPlan
    Definition Classes
    TreeNode
  107. def map[A](f: (LogicalPlan) ⇒ A): Seq[A]
    Definition Classes
    TreeNode
  108. final def mapChildren(f: (LogicalPlan) ⇒ LogicalPlan): LogicalPlan
    Definition Classes
    LeafLike
  109. def mapExpressions(f: (Expression) ⇒ Expression): WriteIntoDelta.this.type
    Definition Classes
    QueryPlan
  110. def mapProductIterator[B](f: (Any) ⇒ B)(implicit arg0: ClassTag[B]): Array[B]
    Attributes
    protected
    Definition Classes
    TreeNode
  111. def markRuleAsIneffective(ruleId: RuleId): Unit
    Attributes
    protected
    Definition Classes
    TreeNode
  112. def maxRows: Option[Long]
    Definition Classes
    LogicalPlan
  113. def maxRowsPerPartition: Option[Long]
    Definition Classes
    LogicalPlan
  114. def metadataOutput: Seq[Attribute]
    Definition Classes
    LogicalPlan
  115. lazy val metrics: Map[String, SQLMetric]
    Definition Classes
    RunnableCommand
  116. final def missingInput: AttributeSet
    Definition Classes
    QueryPlan
  117. val mode: SaveMode
  118. def multiTransformDown(rule: PartialFunction[LogicalPlan, Seq[LogicalPlan]]): Stream[LogicalPlan]
    Definition Classes
    TreeNode
  119. def multiTransformDownWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[LogicalPlan, Seq[LogicalPlan]]): Stream[LogicalPlan]
    Definition Classes
    TreeNode
  120. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  121. def nodeName: String
    Definition Classes
    TreeNode
  122. final val nodePatterns: Seq[TreePattern]
    Definition Classes
    Command → TreeNode
  123. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  124. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  125. def numberedTreeString: String
    Definition Classes
    TreeNode
  126. val options: DeltaOptions
  127. val origin: Origin
    Definition Classes
    TreeNode → WithOrigin
  128. def otherCopyArgs: Seq[AnyRef]
    Attributes
    protected
    Definition Classes
    TreeNode
  129. def output: Seq[Attribute]
    Definition Classes
    Command → QueryPlan
  130. def outputOrdering: Seq[SortOrder]
    Definition Classes
    QueryPlan
  131. lazy val outputSet: AttributeSet
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  132. def p(number: Int): LogicalPlan
    Definition Classes
    TreeNode
  133. def parsePredicates(spark: SparkSession, predicate: String): Seq[Expression]

    Converts string predicates into Expressions relative to a transaction.

    Attributes
    protected
    Definition Classes
    DeltaCommand
    Exceptions thrown

    AnalysisException if a non-partition column is referenced.

  134. val partitionColumns: Seq[String]
  135. def prettyJson: String
    Definition Classes
    TreeNode
  136. def printSchema(): Unit
    Definition Classes
    QueryPlan
  137. def producedAttributes: AttributeSet
    Definition Classes
    Command → QueryPlan
  138. def recordDeltaEvent(deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty, data: AnyRef = null, path: Option[Path] = None): Unit

    Used to record the occurrence of a single event or report detailed, operation-specific statistics.

    path

    Used to log the path of the delta table when deltaLog is null.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  139. def recordDeltaOperation[A](deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: ⇒ A): A

    Used to report the duration as well as the success or failure of an operation on a deltaLog.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  140. def recordDeltaOperationForTablePath[A](tablePath: String, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: ⇒ A): A

    Used to report the duration as well as the success or failure of an operation on a tahoePath.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  141. def recordEvent(metric: MetricDefinition, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  142. def recordFrameProfile[T](group: String, name: String)(thunk: ⇒ T): T
    Attributes
    protected
    Definition Classes
    DeltaLogging
  143. def recordOperation[S](opType: OpType, opTarget: String = null, extraTags: Map[TagDefinition, String], isSynchronous: Boolean = true, alwaysRecordStats: Boolean = false, allowAuthTags: Boolean = false, killJvmIfStuck: Boolean = false, outputMetric: MetricDefinition = METRIC_OPERATION_DURATION, silent: Boolean = true)(thunk: ⇒ S): S
    Definition Classes
    DatabricksLogging
  144. def recordProductEvent(metric: MetricDefinition with CentralizableMetric, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  145. def recordProductUsage(metric: MetricDefinition with CentralizableMetric, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  146. def recordUsage(metric: MetricDefinition, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  147. lazy val references: AttributeSet
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  148. def refresh(): Unit
    Definition Classes
    LogicalPlan
  149. def registerReplaceWhereMetrics(spark: SparkSession, txn: OptimisticTransaction, newFiles: Seq[Action], deleteActions: Seq[Action]): Unit

    Operation metrics for replaceWhere need to be recorded separately.

    newFiles

    AddFile and AddCDCFile actions added by the write job

    deleteActions

    AddFile, RemoveFile, and AddCDCFile actions added by the delete job

    Attributes
    protected
    Definition Classes
    WriteIntoDeltaLike
  150. def removeFilesFromPaths(deltaLog: DeltaLog, nameToAddFileMap: Map[String, AddFile], filesToRewrite: Seq[String], operationTimestamp: Long): Seq[RemoveFile]

    This method provides the RemoveFile actions that are necessary for files that are touched and need to be rewritten in methods like Delete, Update, and Merge.

    deltaLog

    The DeltaLog of the table that is being operated on

    nameToAddFileMap

    A map generated using generateCandidateFileMap.

    filesToRewrite

    Absolute paths of the files that were touched. We will search for these in candidateFiles. Obtained as the output of the input_file_name function.

    operationTimestamp

    The timestamp of the operation

    Attributes
    protected
    Definition Classes
    DeltaCommand
  151. def resolve(nameParts: Seq[String], resolver: Resolver): Option[NamedExpression]
    Definition Classes
    LogicalPlan
  152. def resolve(schema: StructType, resolver: Resolver): Seq[Attribute]
    Definition Classes
    LogicalPlan
  153. def resolveChildren(nameParts: Seq[String], resolver: Resolver): Option[NamedExpression]
    Definition Classes
    LogicalPlan
  154. def resolveExpressions(r: PartialFunction[Expression, Expression]): LogicalPlan
    Definition Classes
    AnalysisHelper
  155. def resolveExpressionsWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): LogicalPlan
    Definition Classes
    AnalysisHelper
  156. def resolveIdentifier(analyzer: Analyzer, identifier: TableIdentifier): LogicalPlan

    Use the analyzer to resolve the provided identifier.

    analyzer

    The session state analyzer to call

    identifier

    Table identifier to resolve

    Attributes
    protected
    Definition Classes
    DeltaCommand
  157. def resolveOperators(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  158. def resolveOperatorsDown(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  159. def resolveOperatorsDownWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  160. def resolveOperatorsUp(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  161. def resolveOperatorsUpWithNewOutput(rule: PartialFunction[LogicalPlan, (LogicalPlan, Seq[(Attribute, Attribute)])]): LogicalPlan
    Definition Classes
    AnalysisHelper
  162. def resolveOperatorsUpWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  163. def resolveOperatorsWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  164. def resolveQuoted(name: String, resolver: Resolver): Option[NamedExpression]
    Definition Classes
    LogicalPlan
  165. lazy val resolved: Boolean
    Definition Classes
    LogicalPlan
  166. def rewriteAttrs(attrMap: AttributeMap[Attribute]): LogicalPlan
    Definition Classes
    QueryPlan
  167. def run(sparkSession: SparkSession): Seq[Row]
    Definition Classes
    WriteIntoDelta → RunnableCommand
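    A DataFrame save through the Delta source is what ultimately executes this command. As a minimal sketch (the path and predicate are hypothetical), combining Overwrite mode with the replaceWhere option transactionally replaces only the rows matching the predicate, as described above:

    ```scala
    // Sketch: this DataFrameWriter call is planned as a WriteIntoDelta
    // command, whose run(sparkSession) performs the transactional write.
    df.write
      .format("delta")
      .mode("overwrite")
      .option("replaceWhere", "date >= '2024-01-01'")
      .save("/tmp/delta/events")
    ```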
  168. def sameOutput(other: LogicalPlan): Boolean
    Definition Classes
    LogicalPlan
  169. final def sameResult(other: LogicalPlan): Boolean
    Definition Classes
    QueryPlan
  170. lazy val schema: StructType
    Definition Classes
    QueryPlan
  171. val schemaInCatalog: Option[StructType]
  172. def schemaString: String
    Definition Classes
    QueryPlan
  173. final def semanticHash(): Int
    Definition Classes
    QueryPlan
  174. def sendDriverMetrics(spark: SparkSession, metrics: Map[String, SQLMetric]): Unit

    Send the driver-side metrics.

    This is needed to make the SQL metrics visible in the Spark UI. All metrics are initialized to 0 by default, so that is what is reported when an already-executed action is skipped.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  175. def setTagValue[T](tag: TreeNodeTag[T], value: T): Unit
    Definition Classes
    TreeNode
  176. def simpleString(maxFields: Int): String
    Definition Classes
    QueryPlan → TreeNode
  177. def simpleStringWithNodeId(): String
    Definition Classes
    QueryPlan → TreeNode
  178. def statePrefix: String
    Attributes
    protected
    Definition Classes
    LogicalPlan → QueryPlan
  179. def stats: Statistics
    Definition Classes
    Command → LogicalPlanStats
  180. val statsCache: Option[Statistics]
    Attributes
    protected
    Definition Classes
    LogicalPlanStats
  181. def stringArgs: Iterator[Any]
    Attributes
    protected
    Definition Classes
    TreeNode
  182. lazy val subqueries: Seq[LogicalPlan]
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  183. def subqueriesAll: Seq[LogicalPlan]
    Definition Classes
    QueryPlan
  184. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  185. def toJSON: String
    Definition Classes
    TreeNode
  186. def toString(): String
    Definition Classes
    TreeNode → AnyRef → Any
  187. def transform(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    TreeNode
  188. def transformAllExpressions(rule: PartialFunction[Expression, Expression]): WriteIntoDelta.this.type
    Definition Classes
    QueryPlan
  189. def transformAllExpressionsWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): WriteIntoDelta.this.type
    Definition Classes
    AnalysisHelper → QueryPlan
  190. def transformAllExpressionsWithSubqueries(rule: PartialFunction[Expression, Expression]): WriteIntoDelta.this.type
    Definition Classes
    QueryPlan
  191. def transformDown(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    TreeNode
  192. def transformDownWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper → TreeNode
  193. def transformDownWithSubqueries(f: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    QueryPlan
  194. def transformDownWithSubqueriesAndPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(f: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    QueryPlan
  195. def transformExpressions(rule: PartialFunction[Expression, Expression]): WriteIntoDelta.this.type
    Definition Classes
    QueryPlan
  196. def transformExpressionsDown(rule: PartialFunction[Expression, Expression]): WriteIntoDelta.this.type
    Definition Classes
    QueryPlan
  197. def transformExpressionsDownWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): WriteIntoDelta.this.type
    Definition Classes
    QueryPlan
  198. def transformExpressionsUp(rule: PartialFunction[Expression, Expression]): WriteIntoDelta.this.type
    Definition Classes
    QueryPlan
  199. def transformExpressionsUpWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): WriteIntoDelta.this.type
    Definition Classes
    QueryPlan
  200. def transformExpressionsWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): WriteIntoDelta.this.type
    Definition Classes
    QueryPlan
  201. def transformUp(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    TreeNode
  202. def transformUpWithBeforeAndAfterRuleOnChildren(cond: (LogicalPlan) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[(LogicalPlan, LogicalPlan), LogicalPlan]): LogicalPlan
    Definition Classes
    TreeNode
  203. def transformUpWithNewOutput(rule: PartialFunction[LogicalPlan, (LogicalPlan, Seq[(Attribute, Attribute)])], skipCond: (LogicalPlan) ⇒ Boolean, canGetOutput: (LogicalPlan) ⇒ Boolean): LogicalPlan
    Definition Classes
    AnalysisHelper → QueryPlan
  204. def transformUpWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper → TreeNode
  205. def transformUpWithSubqueries(f: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    QueryPlan
  206. def transformWithPruning(cond: (TreePatternBits) ⇒ Boolean, ruleId: RuleId)(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    TreeNode
  207. def transformWithSubqueries(f: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    QueryPlan
  208. lazy val treePatternBits: BitSet
    Definition Classes
    QueryPlan → TreeNode → TreePatternBits
  209. def treeString(append: (String) ⇒ Unit, verbose: Boolean, addSuffix: Boolean, maxFields: Int, printOperatorId: Boolean): Unit
    Definition Classes
    TreeNode
  210. final def treeString(verbose: Boolean, addSuffix: Boolean, maxFields: Int, printOperatorId: Boolean): String
    Definition Classes
    TreeNode
  211. final def treeString: String
    Definition Classes
    TreeNode
  212. def unsetTagValue[T](tag: TreeNodeTag[T]): Unit
    Definition Classes
    TreeNode
  213. final def updateMetadata(spark: SparkSession, txn: OptimisticTransaction, schema: StructType, partitionColumns: Seq[String], configuration: Map[String, String], isOverwriteMode: Boolean, rearrangeOnly: Boolean): Unit
    Attributes
    protected
    Definition Classes
    ImplicitMetadataOperation
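    Schema evolution on an existing table flows through this method. A minimal sketch (the path is hypothetical), assuming the incoming DataFrame carries new columns and schema merging is enabled:

    ```scala
    // Sketch: appending a DataFrame whose schema has extra columns.
    // With mergeSchema enabled, the new columns are added to the table's
    // schema; conflicting column types (e.g. INT vs. STRING) raise an
    // exception, per the existing-table semantics described above.
    df.write
      .format("delta")
      .mode("append")
      .option("mergeSchema", "true")
      .save("/tmp/delta/events")
    ```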
  214. def updateOuterReferencesInSubquery(plan: LogicalPlan, attrMap: AttributeMap[Attribute]): LogicalPlan
    Definition Classes
    AnalysisHelper → QueryPlan
  215. lazy val validConstraints: ExpressionSet
    Attributes
    protected
    Definition Classes
    QueryPlanConstraints
  216. def verboseString(maxFields: Int): String
    Definition Classes
    QueryPlan → TreeNode
  217. def verboseStringWithOperatorId(): String
    Definition Classes
    QueryPlan
  218. def verboseStringWithSuffix(maxFields: Int): String
    Definition Classes
    LogicalPlan → TreeNode
  219. def verifyPartitionPredicates(spark: SparkSession, partitionColumns: Seq[String], predicates: Seq[Expression]): Unit
    Definition Classes
    DeltaCommand
  220. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  221. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  222. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  223. final def withNewChildren(newChildren: Seq[LogicalPlan]): LogicalPlan
    Definition Classes
    TreeNode
  224. def withNewChildrenInternal(newChildren: IndexedSeq[LogicalPlan]): LogicalPlan
    Definition Classes
    LeafLike
  225. def withNewWriterConfiguration(updatedConfiguration: Map[String, String]): WriteIntoDeltaLike

    A helper method to create a new instance of WriteIntoDeltaLike with an updated configuration.

    Definition Classes
    WriteIntoDelta → WriteIntoDeltaLike
  226. def withStatusCode[T](statusCode: String, defaultMessage: String, data: Map[String, Any] = Map.empty)(body: ⇒ T): T

    Report a log entry to indicate that a command is running.

    Definition Classes
    DeltaProgressReporter
  227. def write(txn: OptimisticTransaction, sparkSession: SparkSession, clusterBySpecOpt: Option[ClusterBySpec] = None, isTableReplace: Boolean = false): Seq[Action]
    Definition Classes
    WriteIntoDeltaLike
  228. def writeAndReturnCommitData(txn: OptimisticTransaction, sparkSession: SparkSession, clusterBySpecOpt: Option[ClusterBySpec] = None, isTableReplace: Boolean = false): TaggedCommitData[Action]

    Write data into the Delta table as part of txn and return the actions to be committed.

    Definition Classes
    WriteIntoDelta → WriteIntoDeltaLike
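    Dynamic partition overwrite, described above, is one of the overwrite modes this write path supports. A sketch under hypothetical paths, assuming the table is partitioned:

    ```scala
    // Sketch: dynamic partition overwrite replaces only the partitions
    // touched by the incoming data, leaving other partitions intact.
    // Note: combining this DataFrameWriter option with replaceWhere
    // throws an error, as documented above.
    df.write
      .format("delta")
      .mode("overwrite")
      .option("partitionOverwriteMode", "dynamic")
      .save("/tmp/delta/events")
    ```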

Inherited from Serializable

Inherited from Serializable

Inherited from WriteIntoDeltaLike

Inherited from DeltaCommand

Inherited from ImplicitMetadataOperation

Inherited from DeltaLogging

Inherited from DatabricksLogging

Inherited from DeltaProgressReporter

Inherited from LoggingShims

Inherited from LeafRunnableCommand

Inherited from LeafLike[LogicalPlan]

Inherited from RunnableCommand

Inherited from Command

Inherited from LogicalPlan

Inherited from Logging

Inherited from QueryPlanConstraints

Inherited from ConstraintHelper

Inherited from LogicalPlanDistinctKeys

Inherited from LogicalPlanStats

Inherited from AnalysisHelper

Inherited from QueryPlan[LogicalPlan]

Inherited from SQLConfHelper

Inherited from TreeNode[LogicalPlan]

Inherited from WithOrigin

Inherited from TreePatternBits

Inherited from Product

Inherited from Equals

Inherited from AnyRef

Inherited from Any
