object DeltaDataSource extends DatabricksLogging
Linear Supertypes
- DatabricksLogging
- AnyRef
- Any
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final val CDC_ENABLED_KEY: String("readChangeFeed")
- final val CDC_ENABLED_KEY_LEGACY: String("readChangeData")
- final val CDC_END_TIMESTAMP_KEY: String("endingTimestamp")
- final val CDC_END_VERSION_KEY: String("endingVersion")
- final val CDC_START_TIMESTAMP_KEY: String("startingTimestamp")
- final val CDC_START_VERSION_KEY: String("startingVersion")
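Taken together, these keys configure a Change Data Feed read through the standard DataFrameReader API. A hedged sketch (a `SparkSession` named `spark` and a CDF-enabled table at `tablePath` are assumed, not shown in this listing):

```scala
// Sketch only: read changes between two table versions via the CDC option keys.
// `spark` (a SparkSession) and `tablePath` are assumed to be in scope.
val changes = spark.read
  .format("delta")
  .option("readChangeFeed", "true") // CDC_ENABLED_KEY
  .option("startingVersion", 5)     // CDC_START_VERSION_KEY
  .option("endingVersion", 10)      // CDC_END_VERSION_KEY
  .load(tablePath)
// Returned rows carry _change_type, _commit_version and _commit_timestamp columns.
```

The `starting*`/`ending*` keys accept either a version or a timestamp, but only one of each pair per read.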
- final val TIME_TRAVEL_SOURCE_KEY: String("__time_travel_source__")
- final val TIME_TRAVEL_TIMESTAMP_KEY: String("timestampAsOf")
The option key for time traveling using a timestamp. The timestamp should be a valid timestamp string which can be cast to a timestamp type.
- final val TIME_TRAVEL_VERSION_KEY: String("versionAsOf")
The option key for time traveling using a version of a table. This value should be castable to a long.
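Both time travel keys are passed as read options. A hedged sketch of each form (a `SparkSession` named `spark` and `tablePath` are assumed to be in scope):

```scala
// Sketch only: time travel reads using the option keys above.
val asOfVersion = spark.read
  .format("delta")
  .option("versionAsOf", 1234L)                   // TIME_TRAVEL_VERSION_KEY
  .load(tablePath)

val asOfTimestamp = spark.read
  .format("delta")
  .option("timestampAsOf", "2024-01-01 00:00:00") // TIME_TRAVEL_TIMESTAMP_KEY
  .load(tablePath)
```

Specifying both keys on the same read is an error; choose one.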
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- def decodePartitioningColumns(str: String): Seq[String]
- def encodePartitioningColumns(columns: Seq[String]): String
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def extractSchemaTrackingLocationConfig(spark: SparkSession, parameters: Map[String, String]): Option[String]
Extract the schema tracking location from options.
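The option key itself does not appear in this listing; in Delta's documented streaming options it is `schemaTrackingLocation` (treated here as an assumption). A streaming-read sketch:

```scala
// Sketch only: pointing a Delta streaming source at a schema tracking log.
// The "schemaTrackingLocation" key is an assumption based on Delta's documented
// streaming options; `spark`, `checkpointPath` and `tablePath` are assumed in scope.
val stream = spark.readStream
  .format("delta")
  .option("schemaTrackingLocation", checkpointPath) // where the schema log is kept
  .load(tablePath)
```

The tracking location lets the source follow non-additive schema changes (for example column renames or drops) across restarts.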
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def getMetadataTrackingLogForDeltaSource(spark: SparkSession, sourceSnapshot: SnapshotDescriptor, parameters: Map[String, String], sourceMetadataPathOpt: Option[String] = None, mergeConsecutiveSchemaChanges: Boolean = false): Option[DeltaSourceMetadataTrackingLog]
Creates a schema log for a Delta streaming source, if possible.
- def getTimeTravelVersion(parameters: Map[String, String]): Option[DeltaTimeTravelSpec]
Extracts the time travel specification, if users provided an option to time travel the relation.
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- def logConsole(line: String): Unit
- Definition Classes
- DatabricksLogging
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def parsePathIdentifier(spark: SparkSession, userPath: String, options: Map[String, String]): (Path, Seq[(String, String)], Option[DeltaTimeTravelSpec])
For Delta, we allow certain magic to be performed through the paths that are provided by users. Normally, a user specified path should point to the root of a Delta table. However, some users are used to providing specific partition values through the path, because of how expensive it was to perform partition discovery before. We treat these partition values as logical partition filters, if a table does not exist at the provided path.
In addition, we allow users to provide time travel specifications through the path. This is provided after an `@` symbol at the end of a path, followed by either a time specification in `yyyyMMddHHmmssSSS` format or a version number preceded by a `v`. This method parses these specifications and returns these modifiers only if a path does not really exist at the provided path. We first parse out the time travel specification, and then the partition filters. For example, a path specified as `/some/path/partition=1@v1234` will be parsed into `/some/path` with filter `partition=1` and a time travel spec of version 1234.
- returns
A tuple of the root path of the Delta table, partition filters, and time travel options.
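The suffix-parsing step described above can be illustrated in isolation. This simplified pure-Scala sketch is not the real implementation (it handles only the `@` suffix, not the partition-directory stripping or the existence check), but it mirrors the two accepted forms:

```scala
// Simplified sketch of the time-travel suffix parsing described above:
// split the user path on its last '@' and interpret the tail as either a
// version ("v" + digits) or a 17-digit yyyyMMddHHmmssSSS timestamp.
def parseTimeTravelSuffix(userPath: String): (String, Option[Either[Long, String]]) = {
  val at = userPath.lastIndexOf('@')
  if (at < 0) (userPath, None)
  else {
    val base = userPath.substring(0, at)
    val spec = userPath.substring(at + 1)
    if (spec.matches("v\\d+")) (base, Some(Left(spec.drop(1).toLong)))  // version form
    else if (spec.matches("\\d{17}")) (base, Some(Right(spec)))        // timestamp form
    else (userPath, None) // not a time-travel suffix; leave the path untouched
  }
}

// parseTimeTravelSuffix("/some/path/partition=1@v1234")
//   yields ("/some/path/partition=1", Some(Left(1234L)));
// the real method would then peel off "partition=1" as a partition filter.
```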
- def recordEvent(metric: MetricDefinition, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
- Definition Classes
- DatabricksLogging
- def recordOperation[S](opType: OpType, opTarget: String = null, extraTags: Map[TagDefinition, String], isSynchronous: Boolean = true, alwaysRecordStats: Boolean = false, allowAuthTags: Boolean = false, killJvmIfStuck: Boolean = false, outputMetric: MetricDefinition = METRIC_OPERATION_DURATION, silent: Boolean = true)(thunk: => S): S
- Definition Classes
- DatabricksLogging
- def recordProductEvent(metric: MetricDefinition with CentralizableMetric, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
- Definition Classes
- DatabricksLogging
- def recordProductUsage(metric: MetricDefinition with CentralizableMetric, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
- Definition Classes
- DatabricksLogging
- def recordUsage(metric: MetricDefinition, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
- Definition Classes
- DatabricksLogging
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- def verifyAndCreatePartitionFilters(userPath: String, snapshot: Snapshot, partitionFilters: Seq[(String, String)]): Seq[Expression]
Verifies that the provided partition filters are valid and returns the corresponding expressions.
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()