object VacuumCommand extends VacuumCommandImpl with Serializable

Vacuums the table by clearing all untracked files and folders within it. The command first lists all files and directories in the table and computes their paths relative to the table base. It then gets the list of all tracked files for this table, which may or may not be within the table base path, and computes their relative paths as well; files outside of the table path are ignored. Finally, it takes the difference of the two sets and deletes directories that were already empty, together with all files within the table that are no longer tracked.
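In practice this command is usually reached through the user-facing APIs rather than invoked directly. A minimal sketch, assuming a SparkSession configured with the Delta extensions and an existing Delta table at the hypothetical path /tmp/delta/events:

```scala
// Sketch only: assumes the Delta Lake artifact (io.delta:delta-spark)
// is on the classpath and a Delta table exists at /tmp/delta/events.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("vacuum-example")
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog",
          "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .getOrCreate()

// SQL form: keep 7 days (168 hours) of history.
spark.sql("VACUUM delta.`/tmp/delta/events` RETAIN 168 HOURS")

// DeltaTable API form:
import io.delta.tables.DeltaTable
DeltaTable.forPath(spark, "/tmp/delta/events").vacuum(168)
```

Both forms ultimately delegate to the vacuum implementation described on this page.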

Linear Supertypes

Serializable, Serializable, VacuumCommandImpl, DeltaCommand, DeltaLogging, DatabricksLogging, DeltaProgressReporter, LoggingShims, Logging, AnyRef, Any

Type Members

  1. implicit class LogStringContext extends AnyRef
    Definition Classes
    LoggingShims
  2. case class FileNameAndSize(path: String, length: Long) extends Product with Serializable

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. val INVENTORY_SCHEMA: StructType

    path: fully qualified URI
    length: size in bytes
    isDir: boolean indicating if it is a directory
    modificationTime: file update time in milliseconds
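    The documented fields can be sketched as a Spark StructType. The names and types below are inferred from the description above, as an illustration of the expected inventory shape rather than the exact definition in the Delta source:

```scala
// Illustrative sketch of a schema matching the documented inventory fields.
import org.apache.spark.sql.types._

val inventorySchema = StructType(Seq(
  StructField("path", StringType, nullable = false),            // fully qualified URI
  StructField("length", LongType, nullable = false),            // size in bytes
  StructField("isDir", BooleanType, nullable = false),          // directory flag
  StructField("modificationTime", LongType, nullable = false))) // millis since epoch
```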

  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def buildBaseRelation(spark: SparkSession, txn: OptimisticTransaction, actionType: String, rootPath: Path, inputLeafFiles: Seq[String], nameToAddFileMap: Map[String, AddFile]): HadoopFsRelation

    Build a base relation of files that need to be rewritten as part of an update/delete/merge operation.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  7. def checkRetentionPeriodSafety(spark: SparkSession, retentionMs: Option[Long], configuredRetention: Long): Unit

    Additional check on retention duration to prevent people from shooting themselves in the foot.

    Attributes
    protected
  8. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  9. def createSetTransaction(sparkSession: SparkSession, deltaLog: DeltaLog, options: Option[DeltaOptions] = None): Option[SetTransaction]

    Returns SetTransaction if a valid app ID and version are present. Otherwise returns an empty list.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  10. def delete(diff: Dataset[String], spark: SparkSession, basePath: String, hadoopConf: Broadcast[SerializableConfiguration], parallel: Boolean, parallelPartitions: Int): Long

    Attempts to delete the list of candidate files. Returns the number of files deleted.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  11. def deltaAssert(check: ⇒ Boolean, name: String, msg: String, deltaLog: DeltaLog = null, data: AnyRef = null, path: Option[Path] = None): Unit

    Helper method to check invariants in Delta code. Fails when running in tests, records a delta assertion event and logs a warning otherwise.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  12. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  14. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  15. def gc(spark: SparkSession, deltaLog: DeltaLog, dryRun: Boolean = true, retentionHours: Option[Double] = None, inventory: Option[DataFrame] = None, vacuumTypeOpt: Option[String] = None, commandMetrics: Map[String, SQLMetric] = Map.empty, clock: Clock = new SystemClock): DataFrame

    Clears all untracked files and folders within this table. If no inventory is provided, the command first lists all files and directories in the table; if an inventory is provided, it is used instead to identify the files and directories within the table. The relative paths of those files with respect to the table base are computed, along with the relative paths of all tracked files for this table (which may or may not be within the table base path); files outside of the table path are ignored. The command then takes the difference of the two sets and deletes directories that were already empty, together with all files within the table that are no longer tracked.

    dryRun

    If set to true, no files will be deleted. Instead, we will list all files and directories that will be cleared.

    retentionHours

    An optional parameter to override the default Delta tombstone retention period

    inventory

    An optional DataFrame of files and directories within the table, generated from sources such as a blob store inventory report

    returns

    A Dataset containing the paths of the files/folders to delete in dryRun mode. Otherwise returns the base path of the table.
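    A minimal sketch of a dry-run invocation, assuming a SparkSession `spark` with Delta configured and a Delta table at the hypothetical path /tmp/delta/events; most callers should prefer SQL VACUUM or DeltaTable.vacuum over this internal entry point:

```scala
// Sketch only: requires Spark with Delta Lake on the classpath.
import org.apache.spark.sql.delta.DeltaLog
import org.apache.spark.sql.delta.commands.VacuumCommand

val deltaLog = DeltaLog.forTable(spark, "/tmp/delta/events")
val candidates = VacuumCommand.gc(
  spark,
  deltaLog,
  dryRun = true,                 // list candidates, delete nothing
  retentionHours = Some(168.0))  // override the default retention period
candidates.show(truncate = false) // paths a real vacuum would delete
```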

  16. def generateCandidateFileMap(basePath: Path, candidateFiles: Seq[AddFile]): Map[String, AddFile]

    Generates a map of file names to add file entries for operations where we will need to rewrite files such as delete, merge, update. We expect file names to be unique, because each file contains a UUID.

    Definition Classes
    DeltaCommand
  17. def getActionRelativePath(action: FileAction, fs: FileSystem, basePath: Path, relativizeIgnoreError: Boolean): Option[String]
    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  18. def getAllSubdirs(base: String, file: String, fs: FileSystem): Iterator[String]

    Wrapper around DeltaFileOperations.getAllSubDirectories that returns all subdirectories of file with respect to base.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  19. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  20. def getCommonTags(deltaLog: DeltaLog, tahoeId: String): Map[TagDefinition, String]
    Definition Classes
    DeltaLogging
  21. def getDeletionVectorRelativePathAndSize(action: FileAction): Option[(String, Long)]

    Returns the relative path and size of the on-disk deletion vector if it is stored relative to the basePath; otherwise None.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  22. def getDeltaLog(spark: SparkSession, path: Option[String], tableIdentifier: Option[TableIdentifier], operationName: String, hadoopConf: Map[String, String] = Map.empty): DeltaLog

    Utility method to return the DeltaLog of an existing Delta table referred by either the given path or tableIdentifier.

    spark

    SparkSession reference to use.

    path

    Table location. Expects a non-empty tableIdentifier or path.

    tableIdentifier

    Table identifier. Expects a non-empty tableIdentifier or path.

    operationName

    Operation that is getting the DeltaLog, used in error messages.

    hadoopConf

    Hadoop file system options used to build DeltaLog.

    returns

    DeltaLog of the table

    Attributes
    protected
    Definition Classes
    DeltaCommand
    Exceptions thrown

    AnalysisException If no Delta table exists at the given path/identifier, or if neither a path nor a tableIdentifier is provided.

  23. def getDeltaTable(target: LogicalPlan, cmd: String): DeltaTableV2

    Extracts the DeltaTableV2 from a LogicalPlan iff the LogicalPlan is a ResolvedTable with either a DeltaTableV2 or a V1Table that is referencing a Delta table. In all other cases this method will throw a "Table not found" exception.

    Definition Classes
    DeltaCommand
  24. def getDeltaTablePathOrIdentifier(target: LogicalPlan, cmd: String): (Option[TableIdentifier], Option[String])

    Helper method to extract the table id or path from a LogicalPlan representing a Delta table. This uses DeltaCommand.getDeltaTable to convert the LogicalPlan to a DeltaTableV2 and then extracts either the path or identifier from it. If the DeltaTableV2 has a CatalogTable, the table identifier will be returned. Otherwise, the table's path will be returned. Throws an exception if the LogicalPlan does not represent a Delta table.

    Definition Classes
    DeltaCommand
  25. def getErrorData(e: Throwable): Map[String, Any]
    Definition Classes
    DeltaLogging
  26. def getFilesFromDeltaLog(spark: SparkSession, deltaLog: DeltaLog, basePath: String, hadoopConf: Broadcast[SerializableConfiguration], eligibleStartCommitVersion: Long, eligibleEndCommitVersion: Long): Dataset[SerializableFileStatus]

    Returns eligible files to be deleted by looking at the delta log given the start and the end commit versions.

    Attributes
    protected
  27. def getFilesFromDeltaLog(spark: SparkSession, snapshot: Snapshot, basePath: String, hadoopConf: Broadcast[SerializableConfiguration], latestCommitVersionOutsideOfRetentionWindowOpt: Option[Long]): (Dataset[SerializableFileStatus], Option[Long], Option[Long])

    Returns eligible files to be deleted by looking at the delta log. Additionally, it returns the start and the end commit versions (inclusive) which give us the eligible files to be deleted.

    Attributes
    protected
  28. def getFilesFromInventory(basePath: String, partitionColumns: Seq[String], inventory: DataFrame, shouldIcebergMetadataDirBeHidden: Boolean): Dataset[SerializableFileStatus]
  29. def getRelativePath(path: String, fs: FileSystem, basePath: Path, relativizeIgnoreError: Boolean): Option[String]

    Returns the relative path of a file or None if the file lives outside of the table.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  30. def getTableCatalogTable(target: LogicalPlan, cmd: String): Option[CatalogTable]

    Extracts CatalogTable metadata from a LogicalPlan if the plan is a ResolvedTable. The table can be a non-Delta table.

    Definition Classes
    DeltaCommand
  31. def getTablePathOrIdentifier(target: LogicalPlan, cmd: String): (Option[TableIdentifier], Option[String])

    Helper method to extract the table id or path from a LogicalPlan representing a resolved table or path. This calls getDeltaTablePathOrIdentifier if the resolved table is a Delta table. For a non-Delta table with an identifier, the identifier is extracted directly. For a non-Delta table with a path, the path is expected to be wrapped in a ResolvedPathBasedNonDeltaTable and is extracted from there.

    Definition Classes
    DeltaCommand
  32. def getTouchedFile(basePath: Path, escapedFilePath: String, nameToAddFileMap: Map[String, AddFile]): AddFile

    Find the AddFile record corresponding to the file that was read as part of a delete/update/merge operation.

    basePath

    The path of the table. Must not be escaped.

    escapedFilePath

    The path to a file that can be either absolute or relative. All special chars in this path must be already escaped by URI standards.

    nameToAddFileMap

    Map generated through generateCandidateFileMap().

    Definition Classes
    DeltaCommand
  33. def getValidRelativePathsAndSubdirs(action: FileAction, fs: FileSystem, basePath: Path, relativizeIgnoreError: Boolean): Seq[String]

    Returns the relative paths of all files and subdirectories for this action that must be retained during GC.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  34. def hasBeenExecuted(txn: OptimisticTransaction, sparkSession: SparkSession, options: Option[DeltaOptions] = None): Boolean

    Returns true if there is information in the Spark session indicating that this write has already completed successfully.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  35. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  36. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  37. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  38. def isCatalogTable(analyzer: Analyzer, tableIdent: TableIdentifier): Boolean

    Uses the analyzer to determine whether the provided TableIdentifier refers to a path-based table.

    analyzer

    The session state analyzer to call

    tableIdent

    The table identifier to check for being path-based.

    returns

    True if the table is registered in a metastore; false if it is a path-based table.

    Definition Classes
    DeltaCommand
  39. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  40. def isPathIdentifier(tableIdent: TableIdentifier): Boolean

    Checks if the given identifier can be for a Delta table's path.

    tableIdent

    The table identifier to check.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  41. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  42. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  43. def logConsole(line: String): Unit
    Definition Classes
    DatabricksLogging
  44. def logDebug(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  45. def logDebug(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  46. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  47. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  48. def logError(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  49. def logError(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  50. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  51. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  52. def logInfo(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  53. def logInfo(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  54. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  55. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  56. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  57. def logTrace(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  58. def logTrace(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  59. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  60. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  61. def logVacuumEnd(deltaLog: DeltaLog, spark: SparkSession, path: Path, commandMetrics: Map[String, SQLMetric], filesDeleted: Option[Long] = None, dirCounts: Option[Long] = None): Unit

    Record Vacuum specific metrics in the commit log at the END of vacuum.

    deltaLog

    - DeltaLog of the table

    spark

    - spark session

    path

    - the (data) path to the root of the table

    filesDeleted

    - If the vacuum completed, this will contain the number of files deleted; if the vacuum failed, this will be None.

    dirCounts

    - If the vacuum completed, this will contain the number of directories vacuumed; if the vacuum failed, this will be None.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  62. def logVacuumStart(spark: SparkSession, deltaLog: DeltaLog, path: Path, diff: Dataset[String], sizeOfDataToDelete: Long, specifiedRetentionMillis: Option[Long], defaultRetentionMillis: Long): Unit

    Record Vacuum specific metrics in the commit log at the START of vacuum.

    spark

    - spark session

    deltaLog

    - DeltaLog of the table

    path

    - the (data) path to the root of the table

    diff

    - the list of paths (files, directories) that are safe to delete

    sizeOfDataToDelete

    - the amount of data (bytes) to be deleted

    specifiedRetentionMillis

    - the optional override retention period (millis) to keep logically removed files before deleting them

    defaultRetentionMillis

    - the default retention period (millis)

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  63. def logWarning(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  64. def logWarning(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  65. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  66. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  67. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  68. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  69. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  70. def parsePredicates(spark: SparkSession, predicate: String): Seq[Expression]

    Converts string predicates into Expressions relative to a transaction.

    Attributes
    protected
    Definition Classes
    DeltaCommand
    Exceptions thrown

    AnalysisException if a non-partition column is referenced.

  71. def pathStringtoUrlEncodedString(path: String): String
    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  72. def pathToUrlEncodedString(path: Path): String
    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  73. def recordDeltaEvent(deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty, data: AnyRef = null, path: Option[Path] = None): Unit

    Used to record the occurrence of a single event or report detailed, operation specific statistics.

    path

    Used to log the path of the delta table when deltaLog is null.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  74. def recordDeltaOperation[A](deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: ⇒ A): A

    Used to report the duration as well as the success or failure of an operation on a deltaLog.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  75. def recordDeltaOperationForTablePath[A](tablePath: String, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: ⇒ A): A

    Used to report the duration as well as the success or failure of an operation on a tahoePath.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  76. def recordEvent(metric: MetricDefinition, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  77. def recordFrameProfile[T](group: String, name: String)(thunk: ⇒ T): T
    Attributes
    protected
    Definition Classes
    DeltaLogging
  78. def recordOperation[S](opType: OpType, opTarget: String = null, extraTags: Map[TagDefinition, String], isSynchronous: Boolean = true, alwaysRecordStats: Boolean = false, allowAuthTags: Boolean = false, killJvmIfStuck: Boolean = false, outputMetric: MetricDefinition = METRIC_OPERATION_DURATION, silent: Boolean = true)(thunk: ⇒ S): S
    Definition Classes
    DatabricksLogging
  79. def recordProductEvent(metric: MetricDefinition with CentralizableMetric, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  80. def recordProductUsage(metric: MetricDefinition with CentralizableMetric, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  81. def recordUsage(metric: MetricDefinition, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  82. def relativize(path: Path, fs: FileSystem, reservoirBase: Path, isDir: Boolean): String

    Attempts to relativize the path with respect to the reservoirBase and converts the path to a string.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  83. def removeFilesFromPaths(deltaLog: DeltaLog, nameToAddFileMap: Map[String, AddFile], filesToRewrite: Seq[String], operationTimestamp: Long): Seq[RemoveFile]

    This method provides the RemoveFile actions that are necessary for files that are touched and need to be rewritten in methods like Delete, Update, and Merge.

    deltaLog

    The DeltaLog of the table that is being operated on

    nameToAddFileMap

    A map generated using generateCandidateFileMap.

    filesToRewrite

    Absolute paths of the files that were touched. We will search for these in candidateFiles. Obtained as the output of the input_file_name function.

    operationTimestamp

    The timestamp of the operation

    Attributes
    protected
    Definition Classes
    DeltaCommand
  84. def resolveIdentifier(analyzer: Analyzer, identifier: TableIdentifier): LogicalPlan

    Uses the analyzer to resolve the provided identifier.

    analyzer

    The session state analyzer to call

    identifier

    The table identifier to resolve.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  85. def sendDriverMetrics(spark: SparkSession, metrics: Map[String, SQLMetric]): Unit

    Send the driver-side metrics.

    This is needed to make the SQL metrics visible in the Spark UI. All metrics are default initialized with 0 so that's what we're reporting in case we skip an already executed action.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  86. def setCommitClock(deltaLog: DeltaLog, version: Long): Unit
    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  87. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  88. def toString(): String
    Definition Classes
    AnyRef → Any
  89. def urlEncodedStringToPath(path: String): Path
    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  90. def verifyPartitionPredicates(spark: SparkSession, partitionColumns: Seq[String], predicates: Seq[Expression]): Unit
    Definition Classes
    DeltaCommand
  91. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  92. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  93. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  94. def withStatusCode[T](statusCode: String, defaultMessage: String, data: Map[String, Any] = Map.empty)(body: ⇒ T): T

    Report a log to indicate some command is running.

    Definition Classes
    DeltaProgressReporter
  95. object VacuumType extends Enumeration
