
object DeltaLog extends DeltaLogging

Linear Supertypes
  DeltaLog, DeltaLogging, DatabricksLogging, DeltaProgressReporter, LoggingShims, Logging, AnyRef, Any

Type Members

  1. implicit class LogStringContext extends AnyRef
    Definition Classes
    LoggingShims
  2. type CacheKey = (Path, Map[String, String])

We create only a single DeltaLog for any given CacheKey to avoid wasted work in reconstructing the log.
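The deduplicating cache can be illustrated with a plain-Scala sketch. `DeltaLogStub` and the `forTable` helper below are hypothetical stand-ins, not the real Delta classes; the point is that the key pairs the table path with the read options, so the same path opened with different options yields a distinct instance:

```scala
import java.util.concurrent.ConcurrentHashMap

// Hypothetical stand-ins for illustration only.
final case class TablePath(value: String)
final class DeltaLogStub(val path: TablePath, val options: Map[String, String])

// Cache keyed by (path, options): reconstruction of the log happens at most
// once per distinct key.
type CacheKey = (TablePath, Map[String, String])
val logCache = new ConcurrentHashMap[CacheKey, DeltaLogStub]()

def forTable(path: TablePath, options: Map[String, String]): DeltaLogStub =
  logCache.computeIfAbsent((path, options), _ => new DeltaLogStub(path, options))
```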

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def assertRemovable(snapshot: Snapshot): Unit

Checks whether this table only accepts appends. If so, it throws an error from operations that can remove data, such as DELETE, UPDATE, or MERGE.
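The append-only guard can be sketched in plain Scala. The `SnapshotStub` class and the body below are hypothetical stand-ins, not the real implementation; Delta marks append-only tables with the `delta.appendOnly` table property:

```scala
// Hypothetical stand-in for a Snapshot carrying table properties.
final case class SnapshotStub(properties: Map[String, String])

// Sketch of the append-only check: throw if the table forbids removing data.
def assertRemovable(snapshot: SnapshotStub): Unit = {
  val appendOnly =
    snapshot.properties.getOrElse("delta.appendOnly", "false").toBoolean
  if (appendOnly) {
    throw new UnsupportedOperationException(
      "This table is append-only; DELETE/UPDATE/MERGE cannot remove data from it.")
  }
}
```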

  6. def clearCache(): Unit
  7. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  8. def deltaAssert(check: ⇒ Boolean, name: String, msg: String, deltaLog: DeltaLog = null, data: AnyRef = null, path: Option[Path] = None): Unit

    Helper method to check invariants in Delta code.

    Helper method to check invariants in Delta code. Fails when running in tests, records a delta assertion event and logs a warning otherwise.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. def filterFileList(partitionSchema: StructType, files: DataFrame, partitionFilters: Seq[Expression], partitionColumnPrefixes: Seq[String] = Nil, shouldRewritePartitionFilters: Boolean = true): DataFrame

    Filters the given Dataset by the given partitionFilters, returning those that match.

    files: The active files in the DeltaLog state, which contain the partition value information
    partitionFilters: Filters on the partition columns
    partitionColumnPrefixes: The path to the partitionValues column, if it's nested
    shouldRewritePartitionFilters: Whether to rewrite partitionFilters to be over the AddFile schema

  12. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. def forTable(spark: SparkSession, table: CatalogTable, clock: Clock): DeltaLog

    Helper for creating a log for the table.

  14. def forTable(spark: SparkSession, table: CatalogTable, options: Map[String, String]): DeltaLog

    Helper for creating a log for the table.

  15. def forTable(spark: SparkSession, tableName: TableIdentifier, clock: Clock): DeltaLog

    Helper for creating a log for the table.

  16. def forTable(spark: SparkSession, table: CatalogTable): DeltaLog

    Helper for creating a log for the table.

  17. def forTable(spark: SparkSession, tableName: TableIdentifier): DeltaLog

    Helper for creating a log for the table.

  18. def forTable(spark: SparkSession, dataPath: Path, clock: Clock): DeltaLog

    Helper for creating a log when it is stored at the root of the data.

  19. def forTable(spark: SparkSession, dataPath: Path, options: Map[String, String]): DeltaLog

    Helper for creating a log when it is stored at the root of the data.

  20. def forTable(spark: SparkSession, dataPath: Path): DeltaLog

    Helper for creating a log when it is stored at the root of the data.

  21. def forTable(spark: SparkSession, dataPath: String): DeltaLog

    Helper for creating a log when it is stored at the root of the data.

  22. def forTableWithSnapshot(spark: SparkSession, table: CatalogTable, options: Map[String, String]): (DeltaLog, Snapshot)

    Helper for getting a log, as well as the latest snapshot, of the table.

  23. def forTableWithSnapshot(spark: SparkSession, dataPath: Path, options: Map[String, String]): (DeltaLog, Snapshot)

    Helper for getting a log, as well as the latest snapshot, of the table.

  24. def forTableWithSnapshot(spark: SparkSession, tableName: TableIdentifier): (DeltaLog, Snapshot)

    Helper for getting a log, as well as the latest snapshot, of the table.

  25. def forTableWithSnapshot(spark: SparkSession, dataPath: Path): (DeltaLog, Snapshot)

    Helper for getting a log, as well as the latest snapshot, of the table.

  26. def forTableWithSnapshot(spark: SparkSession, dataPath: String): (DeltaLog, Snapshot)

    Helper for getting a log, as well as the latest snapshot, of the table.
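All of the forTable and forTableWithSnapshot overloads ultimately resolve the transaction log directory from the table's root path: the log lives under `_delta_log` at the root of the data. A minimal plain-Scala sketch of that resolution (the `deltaLogPath` helper is hypothetical, for illustration only):

```scala
import java.nio.file.Paths

// Hypothetical sketch: resolve the _delta_log directory from a table's
// data path, as the path-based forTable helpers do.
def deltaLogPath(dataPath: String): String =
  Paths.get(dataPath, "_delta_log").toString
```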

  27. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  28. def getCommonTags(deltaLog: DeltaLog, tahoeId: String): Map[TagDefinition, String]
    Definition Classes
    DeltaLogging
  29. def getErrorData(e: Throwable): Map[String, Any]
    Definition Classes
    DeltaLogging
  30. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  31. def indexToRelation(spark: SparkSession, index: DeltaLogFileIndex, additionalOptions: Map[String, String], schema: StructType = Action.logSchema): LogicalRelation

    Creates a LogicalRelation for a given DeltaLogFileIndex, with all necessary file source options taken from the Delta log. All reads of Delta metadata files should use this method.

  32. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  33. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def invalidateCache(spark: SparkSession, dataPath: Path): Unit

    Invalidate the cached DeltaLog object for the given dataPath.

  35. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  36. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  37. val jsonCommitParseOption: Map[String, String]
  38. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  39. def logConsole(line: String): Unit
    Definition Classes
    DatabricksLogging
  40. def logDebug(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  41. def logDebug(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  42. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  43. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  44. def logError(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  45. def logError(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  46. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  47. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  48. def logInfo(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  49. def logInfo(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  50. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  51. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  52. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  53. def logTrace(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  54. def logTrace(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  55. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  56. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  57. def logWarning(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  58. def logWarning(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    LoggingShims
  59. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  60. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  61. def minSetTransactionRetentionInterval(metadata: Metadata): Option[Long]

    How long to keep around SetTransaction actions before physically deleting them.

  62. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  63. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  64. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  65. def recordDeltaEvent(deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty, data: AnyRef = null, path: Option[Path] = None): Unit

    Used to record the occurrence of a single event or to report detailed, operation-specific statistics.

    path: Used to log the path of the Delta table when deltaLog is null.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  66. def recordDeltaOperation[A](deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: ⇒ A): A

    Used to report the duration, as well as the success or failure, of an operation on a deltaLog.

    Attributes
    protected
    Definition Classes
    DeltaLogging
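The duration-plus-outcome reporting pattern can be sketched without Spark. The `recordOperation` helper and `report` callback below are hypothetical simplifications; the real method emits structured usage logs with tag definitions:

```scala
// Sketch: run a thunk, then report its operation type, elapsed milliseconds,
// and whether it succeeded, even when the thunk throws.
def recordOperation[A](opType: String)(thunk: => A)(
    report: (String, Long, Boolean) => Unit): A = {
  val start = System.nanoTime()
  var succeeded = false
  try {
    val result = thunk
    succeeded = true
    result
  } finally {
    report(opType, (System.nanoTime() - start) / 1000000L, succeeded)
  }
}
```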
  67. def recordDeltaOperationForTablePath[A](tablePath: String, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: ⇒ A): A

    Used to report the duration, as well as the success or failure, of an operation on a tahoePath.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  68. def recordEvent(metric: MetricDefinition, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  69. def recordFrameProfile[T](group: String, name: String)(thunk: ⇒ T): T
    Attributes
    protected
    Definition Classes
    DeltaLogging
  70. def recordOperation[S](opType: OpType, opTarget: String = null, extraTags: Map[TagDefinition, String], isSynchronous: Boolean = true, alwaysRecordStats: Boolean = false, allowAuthTags: Boolean = false, killJvmIfStuck: Boolean = false, outputMetric: MetricDefinition = METRIC_OPERATION_DURATION, silent: Boolean = true)(thunk: ⇒ S): S
    Definition Classes
    DatabricksLogging
  71. def recordProductEvent(metric: MetricDefinition with CentralizableMetric, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  72. def recordProductUsage(metric: MetricDefinition with CentralizableMetric, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  73. def recordUsage(metric: MetricDefinition, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  74. def rewritePartitionFilters(partitionSchema: StructType, resolver: Resolver, partitionFilters: Seq[Expression], partitionColumnPrefixes: Seq[String] = Nil): Seq[Expression]

    Rewrite the given partitionFilters to be used for filtering partition values. We need to explicitly resolve the partitioning columns here because the partition columns are stored as keys of a Map type, instead of as attributes in the AddFile schema, and thus cannot be resolved automatically.

    partitionFilters: Filters on the partition columns
    partitionColumnPrefixes: The path to the partitionValues column, if it's nested
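The rewrite can be illustrated on a toy expression tree. The `Expr` AST below is a hypothetical stand-in for Catalyst expressions, and the dotted-name encoding of the map lookup is a simplification; the real method rewrites `org.apache.spark.sql.catalyst.expressions.Expression` trees:

```scala
// Simplified expression AST standing in for Catalyst (illustration only).
sealed trait Expr
final case class Col(name: String) extends Expr
final case class EqualTo(left: Expr, right: Expr) extends Expr
final case class Lit(value: String) extends Expr

// Rewrite references to partition columns into lookups on the partitionValues
// map of the AddFile schema, honoring an optional column-path prefix.
def rewritePartitionFilter(
    partitionCols: Set[String], prefixes: Seq[String])(expr: Expr): Expr =
  expr match {
    case Col(name) if partitionCols.contains(name) =>
      Col((prefixes :+ "partitionValues" :+ name).mkString("."))
    case EqualTo(l, r) =>
      EqualTo(rewritePartitionFilter(partitionCols, prefixes)(l),
              rewritePartitionFilter(partitionCols, prefixes)(r))
    case other => other
  }
```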

  75. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  76. def toString(): String
    Definition Classes
    AnyRef → Any
  77. def tombstoneRetentionMillis(metadata: Metadata): Long

    How long to keep around logically deleted files before physically deleting them.
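The retention period comes from a table property expressed as an interval string. As a minimal sketch, assuming a tiny hand-rolled parser (the real code uses Spark's CalendarInterval and, to my understanding, the `delta.deletedFileRetentionDuration` property):

```scala
import java.util.concurrent.TimeUnit

// Hypothetical parser: turn an interval string like "interval 7 days"
// into milliseconds.
def retentionMillis(interval: String): Long = {
  val Pattern = """interval\s+(\d+)\s+(\w+)""".r
  interval.trim.toLowerCase match {
    case Pattern(n, unit) =>
      val millisPerUnit = unit match {
        case "week" | "weeks" => TimeUnit.DAYS.toMillis(7)
        case "day" | "days"   => TimeUnit.DAYS.toMillis(1)
        case "hour" | "hours" => TimeUnit.HOURS.toMillis(1)
        case other => throw new IllegalArgumentException(s"Unsupported unit: $other")
      }
      n.toLong * millisPerUnit
    case other => throw new IllegalArgumentException(s"Cannot parse: $other")
  }
}
```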

  78. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  79. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  80. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  81. def withStatusCode[T](statusCode: String, defaultMessage: String, data: Map[String, Any] = Map.empty)(body: ⇒ T): T

    Logs a message to indicate that a command is running.

    Definition Classes
    DeltaProgressReporter
