object DeltaSourceMetadataEvolutionSupport

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final val SQL_CONF_UNBLOCK_ALL: String("allowSourceColumnRenameAndDrop")
  5. final val SQL_CONF_UNBLOCK_DROP: String("allowSourceColumnDrop")
  6. final val SQL_CONF_UNBLOCK_RENAME: String("allowSourceColumnRename")
  7. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  8. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. def getCheckpointHash(path: String): Int
  13. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  14. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  15. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  16. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  17. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  18. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  19. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  20. def toString(): String
    Definition Classes
    AnyRef → Any
  21. def validateIfSchemaChangeCanBeUnblockedWithSQLConf(spark: SparkSession, metadataPath: String, currentSchema: PersistedMetadata, previousSchema: PersistedMetadata): Unit

    Given a non-additive operation type from a previous schema evolution, check whether we can proceed using the new schema, given any SQL conf the user has explicitly set to unblock. The SQL conf can take one of the following formats:

    1. spark.databricks.delta.streaming.allowSourceColumnRenameAndDrop = true -> allows all non-additive schema changes to propagate.
    2. spark.databricks.delta.streaming.allowSourceColumnRenameAndDrop.$checkpointHash = true -> allows all non-additive schema changes to propagate for this particular stream.
    3. spark.databricks.delta.streaming.allowSourceColumnRenameAndDrop.$checkpointHash = $deltaVersion

    allowSourceColumnRenameAndDrop can be replaced with:

    1. allowSourceColumnRename to allow only column renames
    2. allowSourceColumnDrop to allow only column drops

    We check for any of these configs given the non-additive operation, and throw a proper error message instructing the user to set the appropriate SQL conf if they would like to unblock.

    metadataPath

    The path to the source-unique metadata location under the checkpoint

    currentSchema

    The current persisted schema

    previousSchema

    The previous persisted schema

    Attributes
    protected[sources]
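
    As a usage illustration, the unblocking confs described above would be set on the session before restarting the stream. This is a minimal sketch, not part of this API; the checkpoint-hash value 1234567 is a placeholder (in practice it would come from something like getCheckpointHash on the stream's metadata path), and a live SparkSession is assumed.

    ```scala
    import org.apache.spark.sql.SparkSession

    val spark: SparkSession = SparkSession.builder()
      .appName("unblock-schema-change-example")
      .getOrCreate()

    // Format 1: allow all non-additive schema changes (renames and drops) to
    // propagate for every stream in this session.
    spark.conf.set(
      "spark.databricks.delta.streaming.allowSourceColumnRenameAndDrop", "true")

    // Format 2: scope the unblock to a single stream via its checkpoint hash
    // (1234567 is a hypothetical placeholder value).
    spark.conf.set(
      "spark.databricks.delta.streaming.allowSourceColumnRenameAndDrop.1234567", "true")

    // Narrower variants: unblock only column renames, or only column drops.
    spark.conf.set(
      "spark.databricks.delta.streaming.allowSourceColumnRename", "true")
    spark.conf.set(
      "spark.databricks.delta.streaming.allowSourceColumnDrop", "true")
    ```

    Setting the per-stream variant (format 2) is the safer choice when only one known stream should be allowed to continue past the schema change.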
  22. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
