object VacuumCommand extends VacuumCommandImpl with Serializable
Vacuums the table by deleting all untracked files and folders within it. The command first lists all files and directories under the table's base path and computes their paths relative to that base. It then retrieves the list of all tracked files for this table, which may or may not live under the base path, and relativizes those paths as well; files outside the table path are ignored. Finally, it takes a diff of the two sets and deletes directories that were already empty, along with all files within the table that are no longer tracked.
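The list-then-diff procedure described above can be sketched with plain Scala collections. This is an illustrative simplification only, not the actual implementation, which operates on distributed Spark Datasets:

```scala
// Illustrative sketch of the vacuum diff: given every path found by listing
// the table directory and every path the Delta log still tracks (both
// relative to the table base), anything listed but not tracked is deletable.
object VacuumDiffSketch {
  def untrackedPaths(listed: Set[String], tracked: Set[String]): Set[String] =
    listed.diff(tracked)

  def main(args: Array[String]): Unit = {
    val listed  = Set("part-000.parquet", "part-001.parquet", "stale/part-old.parquet")
    val tracked = Set("part-000.parquet", "part-001.parquet")
    // Only the stale file remains as a deletion candidate.
    println(untrackedPaths(listed, tracked))
  }
}
```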
Linear Supertypes
- VacuumCommand
- Serializable
- Serializable
- VacuumCommandImpl
- DeltaCommand
- DeltaLogging
- DatabricksLogging
- DeltaProgressReporter
- LoggingShims
- Logging
- AnyRef
- Any
Type Members
- implicit class LogStringContext extends AnyRef
- Definition Classes
- LoggingShims
- case class FileNameAndSize(path: String, length: Long) extends Product with Serializable
Value Members
-
final
def
!=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
final
def
##(): Int
- Definition Classes
- AnyRef → Any
-
final
def
==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
val
INVENTORY_SCHEMA: StructType
- path: fully qualified URI
- length: size in bytes
- isDir: boolean indicating whether it is a directory
- modificationTime: file update time in milliseconds
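For illustration, the four inventory columns can be mirrored by a plain case class. This is a hypothetical helper, not part of the API; the real INVENTORY_SCHEMA is a Spark StructType:

```scala
// Hypothetical row type mirroring INVENTORY_SCHEMA's four columns.
case class InventoryRow(
    path: String,           // fully qualified URI
    length: Long,           // size in bytes
    isDir: Boolean,         // whether the entry is a directory
    modificationTime: Long) // file update time in milliseconds

object InventoryRowDemo {
  def main(args: Array[String]): Unit = {
    val row = InventoryRow("s3://bucket/table/part-000.parquet", 1024L,
      isDir = false, modificationTime = 1700000000000L)
    println(row) // one inventory entry with all four fields populated
  }
}
```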
-
final
def
asInstanceOf[T0]: T0
- Definition Classes
- Any
-
def
buildBaseRelation(spark: SparkSession, txn: OptimisticTransaction, actionType: String, rootPath: Path, inputLeafFiles: Seq[String], nameToAddFileMap: Map[String, AddFile]): HadoopFsRelation
Build a base relation of files that need to be rewritten as part of an update/delete/merge operation.
- Attributes
- protected
- Definition Classes
- DeltaCommand
-
def
checkRetentionPeriodSafety(spark: SparkSession, retentionMs: Option[Long], configuredRetention: Long): Unit
Additional check on retention duration to prevent people from shooting themselves in the foot.
- Attributes
- protected
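The intent of the check can be sketched in plain Scala: reject an override retention period shorter than the table's configured tombstone retention. This is a hypothetical simplification; the real method takes a SparkSession and its behavior can be influenced by configuration:

```scala
// Sketch of a retention-period safety check: an override retention shorter
// than the configured tombstone retention risks deleting files that are
// still needed by readers or by time travel, so it is rejected.
// (Hypothetical simplification of checkRetentionPeriodSafety.)
object RetentionCheckSketch {
  def checkRetentionPeriodSafety(retentionMs: Option[Long], configuredRetentionMs: Long): Unit =
    retentionMs.foreach { ms =>
      require(ms >= configuredRetentionMs,
        s"retention of $ms ms is below the configured $configuredRetentionMs ms")
    }

  def main(args: Array[String]): Unit = {
    val weekMs = 7L * 24 * 60 * 60 * 1000
    checkRetentionPeriodSafety(Some(weekMs), weekMs) // equal to configured: passes
    checkRetentionPeriodSafety(None, weekMs)         // no override: passes
    println("safety checks passed")
  }
}
```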
-
def
clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
-
def
createSetTransaction(sparkSession: SparkSession, deltaLog: DeltaLog, options: Option[DeltaOptions] = None): Option[SetTransaction]
Returns a SetTransaction if a valid app ID and version are present; otherwise returns None.
- Attributes
- protected
- Definition Classes
- DeltaCommand
-
def
delete(diff: Dataset[String], spark: SparkSession, basePath: String, hadoopConf: Broadcast[SerializableConfiguration], parallel: Boolean, parallelPartitions: Int): Long
Attempts to delete the list of candidate files. Returns the number of files deleted.
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
def
deltaAssert(check: ⇒ Boolean, name: String, msg: String, deltaLog: DeltaLog = null, data: AnyRef = null, path: Option[Path] = None): Unit
Helper method to check invariants in Delta code. Fails when running in tests; records a delta assertion event and logs a warning otherwise.
- Attributes
- protected
- Definition Classes
- DeltaLogging
-
final
def
eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
def
equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
def
finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] )
-
def
gc(spark: SparkSession, deltaLog: DeltaLog, dryRun: Boolean = true, retentionHours: Option[Double] = None, inventory: Option[DataFrame] = None, vacuumTypeOpt: Option[String] = None, commandMetrics: Map[String, SQLMetric] = Map.empty, clock: Clock = new SystemClock): DataFrame
Clears all untracked files and folders within this table. If no inventory is provided, the command first lists all files and directories in the table; otherwise the given inventory is used to identify the files and directories within the table. Their paths are relativized with respect to the base of the table. The command then retrieves the list of all tracked files for this table, which may or may not be within the table base path, and relativizes those paths as well; files outside the table path are ignored. Finally, it takes a diff of the two sets and deletes directories that were already empty, along with all files within the table that are no longer tracked.
- dryRun
If set to true, no files will be deleted. Instead, we will list all files and directories that will be cleared.
- retentionHours
An optional parameter to override the default Delta tombstone retention period
- inventory
An optional DataFrame of files and directories within the table, generated from sources like a blob store inventory report
- returns
A Dataset containing the paths of the files/folders to delete in dryRun mode. Otherwise returns the base path of the table.
-
def
generateCandidateFileMap(basePath: Path, candidateFiles: Seq[AddFile]): Map[String, AddFile]
Generates a map of file names to add file entries for operations where we will need to rewrite files, such as delete, merge, update. We expect file names to be unique, because each file contains a UUID.
- Definition Classes
- DeltaCommand
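Because each data file name embeds a UUID, keying candidates by bare file name is well defined. A minimal pure-Scala sketch, with a hypothetical FileEntry standing in for Delta's AddFile action:

```scala
// Sketch of generateCandidateFileMap: key each candidate by its file name,
// relying on UUID-based names being unique within a table.
// FileEntry is a hypothetical stand-in for Delta's AddFile.
case class FileEntry(path: String)

object CandidateMapSketch {
  def generateCandidateFileMap(candidates: Seq[FileEntry]): Map[String, FileEntry] = {
    val byName = candidates.map(f => f.path.split('/').last -> f).toMap
    // If two candidates shared a file name, the map would silently drop one,
    // so the uniqueness assumption is checked explicitly.
    require(byName.size == candidates.size, "file names must be unique")
    byName
  }
}
```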
-
def
getActionRelativePath(action: FileAction, fs: FileSystem, basePath: Path, relativizeIgnoreError: Boolean): Option[String]
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
def
getAllSubdirs(base: String, file: String, fs: FileSystem): Iterator[String]
Wrapper function for DeltaFileOperations.getAllSubDirectories: returns all subdirectories that file has with respect to base.
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
final
def
getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
-
def
getCommonTags(deltaLog: DeltaLog, tahoeId: String): Map[TagDefinition, String]
- Definition Classes
- DeltaLogging
-
def
getDeletionVectorRelativePathAndSize(action: FileAction): Option[(String, Long)]
Returns the path of the on-disk deletion vector if it is stored relative to the basePath, together with its size; otherwise None.
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
def
getDeltaLog(spark: SparkSession, path: Option[String], tableIdentifier: Option[TableIdentifier], operationName: String, hadoopConf: Map[String, String] = Map.empty): DeltaLog
Utility method to return the DeltaLog of an existing Delta table referred by either the given path or tableIdentifier.
- spark
SparkSession reference to use.
- path
Table location. Expects a non-empty tableIdentifier or path.
- tableIdentifier
Table identifier. Expects a non-empty tableIdentifier or path.
- operationName
Operation that is getting the DeltaLog, used in error messages.
- hadoopConf
Hadoop file system options used to build DeltaLog.
- returns
DeltaLog of the table
- Attributes
- protected
- Definition Classes
- DeltaCommand
- Exceptions thrown
AnalysisException if no Delta table exists at the given path/identifier, or if neither a path nor a tableIdentifier is provided.
-
def
getDeltaTable(target: LogicalPlan, cmd: String): DeltaTableV2
Extracts the DeltaTableV2 from a LogicalPlan iff the LogicalPlan is a ResolvedTable with either a DeltaTableV2 or a V1Table that is referencing a Delta table. In all other cases this method will throw a "Table not found" exception.
- Definition Classes
- DeltaCommand
-
def
getDeltaTablePathOrIdentifier(target: LogicalPlan, cmd: String): (Option[TableIdentifier], Option[String])
Helper method to extract the table id or path from a LogicalPlan representing a Delta table. This uses DeltaCommand.getDeltaTable to convert the LogicalPlan to a DeltaTableV2 and then extracts either the path or identifier from it. If the DeltaTableV2 has a CatalogTable, the table identifier will be returned. Otherwise, the table's path will be returned. Throws an exception if the LogicalPlan does not represent a Delta table.
- Definition Classes
- DeltaCommand
-
def
getErrorData(e: Throwable): Map[String, Any]
- Definition Classes
- DeltaLogging
-
def
getFilesFromDeltaLog(spark: SparkSession, deltaLog: DeltaLog, basePath: String, hadoopConf: Broadcast[SerializableConfiguration], eligibleStartCommitVersion: Long, eligibleEndCommitVersion: Long): Dataset[SerializableFileStatus]
Returns eligible files to be deleted by looking at the delta log given the start and the end commit versions.
- Attributes
- protected
-
def
getFilesFromDeltaLog(spark: SparkSession, snapshot: Snapshot, basePath: String, hadoopConf: Broadcast[SerializableConfiguration], latestCommitVersionOutsideOfRetentionWindowOpt: Option[Long]): (Dataset[SerializableFileStatus], Option[Long], Option[Long])
Returns eligible files to be deleted by looking at the delta log. Additionally, it returns the start and the end commit versions (inclusive) which give us the eligible files to be deleted.
- Attributes
- protected
- def getFilesFromInventory(basePath: String, partitionColumns: Seq[String], inventory: DataFrame, shouldIcebergMetadataDirBeHidden: Boolean): Dataset[SerializableFileStatus]
-
def
getRelativePath(path: String, fs: FileSystem, basePath: Path, relativizeIgnoreError: Boolean): Option[String]
Returns the relative path of a file or None if the file lives outside of the table.
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
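The relativization contract can be illustrated with a simplified string-prefix sketch. This is an assumption-laden simplification: the real method works on Hadoop Path and FileSystem objects and handles scheme and authority differences:

```scala
// Sketch of getRelativePath: strip the table base from a file path,
// returning None when the file lives outside of the table.
object RelativePathSketch {
  def getRelativePath(path: String, basePath: String): Option[String] = {
    val base = if (basePath.endsWith("/")) basePath else basePath + "/"
    if (path.startsWith(base)) Some(path.stripPrefix(base)) else None
  }

  def main(args: Array[String]): Unit = {
    // Inside the table: relativized. Outside: ignored by vacuum.
    println(getRelativePath("s3://b/table/part-000.parquet", "s3://b/table"))
    println(getRelativePath("s3://b/other/part-000.parquet", "s3://b/table"))
  }
}
```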
-
def
getTableCatalogTable(target: LogicalPlan, cmd: String): Option[CatalogTable]
Extracts CatalogTable metadata from a LogicalPlan if the plan is a ResolvedTable. The table can be a non-Delta table.
- Definition Classes
- DeltaCommand
-
def
getTablePathOrIdentifier(target: LogicalPlan, cmd: String): (Option[TableIdentifier], Option[String])
Helper method to extract the table id or path from a LogicalPlan representing a resolved table or path. This calls getDeltaTablePathOrIdentifier if the resolved table is a Delta table. For a non-Delta table with an identifier, we extract its identifier. For a non-Delta table with a path, it expects the path to be wrapped in a ResolvedPathBasedNonDeltaTable and extracts it from there.
- Definition Classes
- DeltaCommand
-
def
getTouchedFile(basePath: Path, escapedFilePath: String, nameToAddFileMap: Map[String, AddFile]): AddFile
Find the AddFile record corresponding to the file that was read as part of a delete/update/merge operation.
- basePath
The path of the table. Must not be escaped.
- escapedFilePath
The path to a file that can be either absolute or relative. All special chars in this path must be already escaped by URI standards.
- nameToAddFileMap
Map generated through generateCandidateFileMap().
- Definition Classes
- DeltaCommand
-
def
getValidRelativePathsAndSubdirs(action: FileAction, fs: FileSystem, basePath: Path, relativizeIgnoreError: Boolean): Seq[String]
Returns the relative paths of all files and subdirectories for this action that must be retained during GC.
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
def
hasBeenExecuted(txn: OptimisticTransaction, sparkSession: SparkSession, options: Option[DeltaOptions] = None): Boolean
Returns true if there is information in the spark session that indicates that this write has already been successfully written.
- Attributes
- protected
- Definition Classes
- DeltaCommand
-
def
hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
-
def
initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
- Attributes
- protected
- Definition Classes
- Logging
-
def
initializeLogIfNecessary(isInterpreter: Boolean): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
isCatalogTable(analyzer: Analyzer, tableIdent: TableIdentifier): Boolean
Use the analyzer to see whether the provided TableIdentifier is for a path based table or not
- analyzer
The session state analyzer to call
- tableIdent
Table identifier to determine whether it is path-based or not
- returns
Boolean where true means the table is a table in a metastore and false means the table is a path-based table
- Definition Classes
- DeltaCommand
-
final
def
isInstanceOf[T0]: Boolean
- Definition Classes
- Any
-
def
isPathIdentifier(tableIdent: TableIdentifier): Boolean
Checks if the given identifier can be for a delta table's path
- tableIdent
Table Identifier for which to check
- Attributes
- protected
- Definition Classes
- DeltaCommand
-
def
isTraceEnabled(): Boolean
- Attributes
- protected
- Definition Classes
- Logging
-
def
log: Logger
- Attributes
- protected
- Definition Classes
- Logging
-
def
logConsole(line: String): Unit
- Definition Classes
- DatabricksLogging
-
def
logDebug(entry: LogEntry, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- LoggingShims
-
def
logDebug(entry: LogEntry): Unit
- Attributes
- protected
- Definition Classes
- LoggingShims
-
def
logDebug(msg: ⇒ String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logDebug(msg: ⇒ String): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logError(entry: LogEntry, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- LoggingShims
-
def
logError(entry: LogEntry): Unit
- Attributes
- protected
- Definition Classes
- LoggingShims
-
def
logError(msg: ⇒ String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logError(msg: ⇒ String): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logInfo(entry: LogEntry, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- LoggingShims
-
def
logInfo(entry: LogEntry): Unit
- Attributes
- protected
- Definition Classes
- LoggingShims
-
def
logInfo(msg: ⇒ String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logInfo(msg: ⇒ String): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logName: String
- Attributes
- protected
- Definition Classes
- Logging
-
def
logTrace(entry: LogEntry, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- LoggingShims
-
def
logTrace(entry: LogEntry): Unit
- Attributes
- protected
- Definition Classes
- LoggingShims
-
def
logTrace(msg: ⇒ String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logTrace(msg: ⇒ String): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logVacuumEnd(deltaLog: DeltaLog, spark: SparkSession, path: Path, commandMetrics: Map[String, SQLMetric], filesDeleted: Option[Long] = None, dirCounts: Option[Long] = None): Unit
Record Vacuum specific metrics in the commit log at the END of vacuum.
- deltaLog
- DeltaLog of the table
- spark
- spark session
- path
- the (data) path to the root of the table
- filesDeleted
- If the vacuum completed, this will contain the number of files deleted; if the vacuum failed, this will be None.
- dirCounts
- If the vacuum completed, this will contain the number of directories vacuumed; if the vacuum failed, this will be None.
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
def
logVacuumStart(spark: SparkSession, deltaLog: DeltaLog, path: Path, diff: Dataset[String], sizeOfDataToDelete: Long, specifiedRetentionMillis: Option[Long], defaultRetentionMillis: Long): Unit
Record Vacuum specific metrics in the commit log at the START of vacuum.
- spark
- spark session
- deltaLog
- DeltaLog of the table
- path
- the (data) path to the root of the table
- diff
- the list of paths (files, directories) that are safe to delete
- sizeOfDataToDelete
- the amount of data (bytes) to be deleted
- specifiedRetentionMillis
- the optional override retention period (millis) to keep logically removed files before deleting them
- defaultRetentionMillis
- the default retention period (millis)
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
def
logWarning(entry: LogEntry, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- LoggingShims
-
def
logWarning(entry: LogEntry): Unit
- Attributes
- protected
- Definition Classes
- LoggingShims
-
def
logWarning(msg: ⇒ String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
def
logWarning(msg: ⇒ String): Unit
- Attributes
- protected
- Definition Classes
- Logging
-
final
def
ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
final
def
notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
-
final
def
notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
-
def
parsePredicates(spark: SparkSession, predicate: String): Seq[Expression]
Converts string predicates into Expressions relative to a transaction.
- Attributes
- protected
- Definition Classes
- DeltaCommand
- Exceptions thrown
AnalysisException if a non-partition column is referenced.
-
def
pathStringtoUrlEncodedString(path: String): String
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
def
pathToUrlEncodedString(path: Path): String
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
def
recordDeltaEvent(deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty, data: AnyRef = null, path: Option[Path] = None): Unit
Used to record the occurrence of a single event or report detailed, operation specific statistics.
- path
Used to log the path of the delta table when deltaLog is null.
- Attributes
- protected
- Definition Classes
- DeltaLogging
-
def
recordDeltaOperation[A](deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: ⇒ A): A
Used to report the duration as well as the success or failure of an operation on a deltaLog.
- Attributes
- protected
- Definition Classes
- DeltaLogging
-
def
recordDeltaOperationForTablePath[A](tablePath: String, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: ⇒ A): A
Used to report the duration as well as the success or failure of an operation on a tahoePath.
- Attributes
- protected
- Definition Classes
- DeltaLogging
-
def
recordEvent(metric: MetricDefinition, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
- Definition Classes
- DatabricksLogging
-
def
recordFrameProfile[T](group: String, name: String)(thunk: ⇒ T): T
- Attributes
- protected
- Definition Classes
- DeltaLogging
-
def
recordOperation[S](opType: OpType, opTarget: String = null, extraTags: Map[TagDefinition, String], isSynchronous: Boolean = true, alwaysRecordStats: Boolean = false, allowAuthTags: Boolean = false, killJvmIfStuck: Boolean = false, outputMetric: MetricDefinition = METRIC_OPERATION_DURATION, silent: Boolean = true)(thunk: ⇒ S): S
- Definition Classes
- DatabricksLogging
-
def
recordProductEvent(metric: MetricDefinition with CentralizableMetric, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
- Definition Classes
- DatabricksLogging
-
def
recordProductUsage(metric: MetricDefinition with CentralizableMetric, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
- Definition Classes
- DatabricksLogging
-
def
recordUsage(metric: MetricDefinition, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
- Definition Classes
- DatabricksLogging
-
def
relativize(path: Path, fs: FileSystem, reservoirBase: Path, isDir: Boolean): String
Attempts to relativize the path with respect to the reservoirBase and converts the path to a string.
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
def
removeFilesFromPaths(deltaLog: DeltaLog, nameToAddFileMap: Map[String, AddFile], filesToRewrite: Seq[String], operationTimestamp: Long): Seq[RemoveFile]
This method provides the RemoveFile actions that are necessary for files that are touched and need to be rewritten in methods like Delete, Update, and Merge.
This method provides the RemoveFile actions that are necessary for files that are touched and need to be rewritten in methods like Delete, Update, and Merge.
- deltaLog
The DeltaLog of the table that is being operated on
- nameToAddFileMap
A map generated using generateCandidateFileMap.
- filesToRewrite
Absolute paths of the files that were touched. We will search for these in candidateFiles. Obtained as the output of the input_file_name function.
- operationTimestamp
The timestamp of the operation
- Attributes
- protected
- Definition Classes
- DeltaCommand
-
def
resolveIdentifier(analyzer: Analyzer, identifier: TableIdentifier): LogicalPlan
Use the analyzer to resolve the identifier provided
- analyzer
The session state analyzer to call
- identifier
Table identifier to resolve with the analyzer
- Attributes
- protected
- Definition Classes
- DeltaCommand
-
def
sendDriverMetrics(spark: SparkSession, metrics: Map[String, SQLMetric]): Unit
Send the driver-side metrics.
This is needed to make the SQL metrics visible in the Spark UI. All metrics are default initialized with 0 so that's what we're reporting in case we skip an already executed action.
- Attributes
- protected
- Definition Classes
- DeltaCommand
-
def
setCommitClock(deltaLog: DeltaLog, version: Long): Unit
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
-
final
def
synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
-
def
toString(): String
- Definition Classes
- AnyRef → Any
-
def
urlEncodedStringToPath(path: String): Path
- Attributes
- protected
- Definition Classes
- VacuumCommandImpl
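pathStringtoUrlEncodedString and urlEncodedStringToPath act as an encode/decode pair. A plausible pure-Scala sketch using the JDK's URL codecs follows; this is an assumption, as the actual implementation relies on Hadoop Path and java.net.URI semantics, which differ in details such as space handling:

```scala
import java.net.{URLDecoder, URLEncoder}

// Sketch of the encode/decode pair: special characters in a path string
// are percent-encoded on the way out and decoded on the way back in,
// so the two functions round-trip.
object PathCodecSketch {
  def pathStringToUrlEncodedString(path: String): String =
    URLEncoder.encode(path, "UTF-8")

  def urlEncodedStringToPath(encoded: String): String =
    URLDecoder.decode(encoded, "UTF-8")
}
```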
-
def
verifyPartitionPredicates(spark: SparkSession, partitionColumns: Seq[String], predicates: Seq[Expression]): Unit
- Definition Classes
- DeltaCommand
-
final
def
wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
-
def
withStatusCode[T](statusCode: String, defaultMessage: String, data: Map[String, Any] = Map.empty)(body: ⇒ T): T
Report a log to indicate some command is running.
- Definition Classes
- DeltaProgressReporter
- object VacuumType extends Enumeration