Packages

com.exasol.spark

DefaultSource

class DefaultSource extends RelationProvider with DataSourceRegister with SchemaRelationProvider with CreatableRelationProvider

The default entry point for the integration between Exasol and Spark.

Additionally, it serves as a factory class to create ExasolRelation instances for Spark applications.

Linear Supertypes
CreatableRelationProvider, SchemaRelationProvider, DataSourceRegister, RelationProvider, AnyRef, Any

Instance Constructors

  1. new DefaultSource()

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  6. def createRelation(sqlContext: SQLContext, mode: SaveMode, parameters: Map[String, String], data: DataFrame): BaseRelation

    Creates an ExasolRelation after saving an org.apache.spark.sql.DataFrame into an Exasol table.

    sqlContext

    A Spark org.apache.spark.sql.SQLContext context

    mode

    One of Spark save modes, org.apache.spark.sql.SaveMode

    parameters

    The parameters provided as options, table parameter is required for write

    data

    A Spark org.apache.spark.sql.DataFrame to save as an Exasol table

    returns

    An ExasolRelation relation

    Definition Classes
    DefaultSource → CreatableRelationProvider
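
This write path is reached through the standard DataFrame writer once the source is registered under its short name. A minimal sketch, assuming the connector's documented host, port, username, password and table options; all connection values here are placeholders:

```scala
// Hypothetical usage sketch: save a DataFrame into an Exasol table.
// Option names follow the connector's documented options; the values
// below are placeholders for your own deployment.
df.write
  .mode("overwrite")
  .format("exasol")
  .option("host", "10.0.0.11")
  .option("port", "8563")
  .option("username", "sys")
  .option("password", "exasol")
  .option("table", "MY_SCHEMA.MY_TABLE")
  .save()
```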
  7. def createRelation(sqlContext: SQLContext, parameters: Map[String, String], schema: StructType): BaseRelation

    Creates an ExasolRelation using the provided Spark org.apache.spark.sql.SQLContext, parameters and schema.

    sqlContext

    A Spark org.apache.spark.sql.SQLContext context

    parameters

    The parameters provided as options, query parameter is required for read

    schema

    A user provided schema used to select columns for the relation

    returns

    An ExasolRelation relation

    Definition Classes
    DefaultSource → SchemaRelationProvider
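
With a user-provided schema, the relation is created through the DataFrame reader, and the schema restricts which columns the relation selects. A sketch, assuming the connector's documented options (all connection values are placeholders):

```scala
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

// Hypothetical usage sketch: read with a user-provided schema.
val schema = StructType(Seq(
  StructField("ID", LongType),
  StructField("NAME", StringType)
))

val df = spark.read
  .format("exasol")
  .schema(schema)
  .option("host", "10.0.0.11")
  .option("port", "8563")
  .option("username", "sys")
  .option("password", "exasol")
  .option("query", "SELECT * FROM MY_SCHEMA.MY_TABLE")
  .load()
```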
  8. def createRelation(sqlContext: SQLContext, parameters: Map[String, String]): BaseRelation

    Creates an ExasolRelation using the provided Spark org.apache.spark.sql.SQLContext and parameters.

    Since the schema is not provided, it is inferred by running an Exasol query with a LIMIT 1 clause.

    sqlContext

    A Spark org.apache.spark.sql.SQLContext context

    parameters

    The parameters provided as options, query parameter is required for read

    returns

    An ExasolRelation relation

    Definition Classes
    DefaultSource → RelationProvider
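
When no explicit schema is given, the reader resolves to this overload and the column types come from the LIMIT 1 inference query. A sketch, again assuming the connector's documented options with placeholder values:

```scala
// Hypothetical usage sketch: read without a schema; the connector
// infers one by running the query with a LIMIT 1 clause first.
val df = spark.read
  .format("exasol")
  .option("host", "10.0.0.11")
  .option("port", "8563")
  .option("username", "sys")
  .option("password", "exasol")
  .option("query", "SELECT * FROM MY_SCHEMA.MY_TABLE")
  .load()
```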
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  12. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  13. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  16. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  17. def repartitionPerNode(df: DataFrame, nodesCnt: Int): DataFrame

Rearranges the DataFrame partitions to match the number of Exasol data nodes.

If nodesCnt < df.rdd.getNumPartitions, then perform

df.coalesce(nodesCnt)

in order to reduce the partition count.

If nodesCnt > df.rdd.getNumPartitions, then perform

df.repartition(nodesCnt)

so that there is a partition for each data node.

If the number of partitions and nodes are the same, then do nothing.

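The branching described above can be sketched as follows; this is an illustration of the documented behavior, not the connector's actual implementation:

```scala
import org.apache.spark.sql.DataFrame

// Illustration of the documented logic (not the connector's source).
def repartitionPerNode(df: DataFrame, nodesCnt: Int): DataFrame = {
  val currentCnt = df.rdd.getNumPartitions
  if (nodesCnt < currentCnt) df.coalesce(nodesCnt)         // shrink without a full shuffle
  else if (nodesCnt > currentCnt) df.repartition(nodesCnt) // shuffle to one partition per node
  else df                                                  // counts already match
}
```

Using coalesce for the shrinking case avoids a full shuffle, since it only merges existing partitions.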
  18. def shortName(): String
    Definition Classes
    DefaultSource → DataSourceRegister
  19. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  20. def toString(): String
    Definition Classes
    AnyRef → Any
  21. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  23. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated

Inherited from CreatableRelationProvider

Inherited from SchemaRelationProvider

Inherited from DataSourceRegister

Inherited from RelationProvider

Inherited from AnyRef

Inherited from Any
