object DeltaTimeTravelSpecShims
- Inheritance: DeltaTimeTravelSpecShims → AnyRef → Any
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- def validateTimeTravelSpec(currSpecOpt: Option[DeltaTimeTravelSpec], newSpecOpt: Option[DeltaTimeTravelSpec]): Unit

  Ensures only a single time travel syntax is used (i.e. not version AND timestamp).

  Handles a breaking change between Spark 3.5 and 4.0, which added support for DataFrame-based time travel in Spark (https://github.com/apache/spark/pull/43403). TL;DR: starting in Spark 4.0, we end up with two time travel specifications in DeltaTableV2 when options are used to specify the time travel version/timestamp. This breaks an existing check (against Spark 3.5) that ensures only one time travel specification is used. The workaround is to ignore the two specs if they are identical; if the user actually provided two different time travel specs, Spark would have caught that earlier.
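The check described above can be sketched as follows. This is a minimal illustration, not Delta's actual implementation: the simplified `DeltaTimeTravelSpec` case class, its fields, and the `IllegalArgumentException` error type are assumptions made here so the example is self-contained.

```scala
// Hypothetical, simplified stand-in for Delta's time travel spec,
// assumed here to compare by value (version/timestamp).
case class DeltaTimeTravelSpec(version: Option[Long], timestamp: Option[String])

object DeltaTimeTravelSpecShims {
  def validateTimeTravelSpec(
      currSpecOpt: Option[DeltaTimeTravelSpec],
      newSpecOpt: Option[DeltaTimeTravelSpec]): Unit = {
    (currSpecOpt, newSpecOpt) match {
      // Spark 4.0+ can hand us the same spec twice (once from reader
      // options, once from the relation); identical specs are harmless.
      case (Some(curr), Some(next)) if curr != next =>
        // Illustrative error type; Delta throws its own exception here.
        throw new IllegalArgumentException(
          "Cannot specify time travel in multiple formats.")
      case _ => // zero specs, one spec, or two equal specs: OK
    }
  }
}
```

With this shape, `validateTimeTravelSpec(Some(spec), Some(spec))` passes, while two specs naming different versions fail, matching the behavior described above.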
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()