class DeltaSparkSessionExtension extends (SparkSessionExtensions) ⇒ Unit
An extension for Spark SQL that activates the Delta SQL parser to support the Delta SQL grammar.
Scala example to create a SparkSession with the Delta SQL parser:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession
      .builder()
      .appName("...")
      .master("...")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .getOrCreate()
Java example to create a SparkSession with the Delta SQL parser:

    import org.apache.spark.sql.SparkSession;

    SparkSession spark = SparkSession
      .builder()
      .appName("...")
      .master("...")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .getOrCreate();
Python example to create a SparkSession with the Delta SQL parser. PySpark does not pick up the SQL conf "spark.sql.extensions" in Apache Spark 2.4.x, so the extension must be activated manually there. Moreover, because the SparkSession has already been created and everything has been materialized, a new session must be cloned to trigger the initialization (see SPARK-25003):

    from pyspark.sql import SparkSession

    spark = SparkSession \
        .builder \
        .appName("...") \
        .master("...") \
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension") \
        .getOrCreate()

    if spark.sparkContext.version < "3.":
        spark.sparkContext._jvm.io.delta.sql.DeltaSparkSessionExtension() \
            .apply(spark._jsparkSession.extensions())
        spark = SparkSession(spark.sparkContext, spark._jsparkSession.cloneSession())

Note that `sparkContext` is a property in PySpark, not a method, so it is accessed without parentheses.
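Once the extension is registered, Delta-specific SQL statements are handled by the Delta SQL parser. A minimal sketch in Scala (the table path `/tmp/delta-table` is a placeholder for an existing Delta table; `VACUUM` is shown on the assumption that the Delta version in use supports it as a SQL command):

```scala
import org.apache.spark.sql.SparkSession

// Sketch: run a Delta SQL statement once the extension is active.
val spark = SparkSession
  .builder()
  .appName("...")
  .master("...")
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .getOrCreate()

// VACUUM is parsed by the Delta SQL parser; without the extension,
// Spark's default parser would reject this statement.
spark.sql("VACUUM delta.`/tmp/delta-table`")
```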
Since: 0.4.0
Linear Supertypes
(SparkSessionExtensions) ⇒ Unit, AnyRef, Any
Instance Constructors
- new DeltaSparkSessionExtension()
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def andThen[A](g: (Unit) ⇒ A): (SparkSessionExtensions) ⇒ A
  - Definition Classes: Function1
  - Annotations: @unspecialized()
- def apply(extensions: SparkSessionExtensions): Unit
  - Definition Classes: DeltaSparkSessionExtension → Function1
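Because the class is a `(SparkSessionExtensions) ⇒ Unit` function, `apply` can also be invoked by Spark itself when an instance is passed to `SparkSession.Builder.withExtensions`, which accepts such a function directly. A sketch of this alternative to the `spark.sql.extensions` config key:

```scala
import org.apache.spark.sql.SparkSession
import io.delta.sql.DeltaSparkSessionExtension

// Sketch: register the extension programmatically. withExtensions takes a
// SparkSessionExtensions => Unit, which this class implements, so its apply
// method is called during session construction.
val spark = SparkSession
  .builder()
  .appName("...")
  .master("...")
  .withExtensions(new DeltaSparkSessionExtension())
  .getOrCreate()
```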
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- def compose[A](g: (A) ⇒ SparkSessionExtensions): (A) ⇒ Unit
  - Definition Classes: Function1
  - Annotations: @unspecialized()
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: Function1 → AnyRef → Any
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()