class DefaultSource extends RelationProvider with DataSourceRegister with SchemaRelationProvider with CreatableRelationProvider
The default entry point for the integration between Exasol and Spark.
Additionally, it serves as a factory class that creates ExasolRelation instances for Spark applications.
Instance Constructors
- new DefaultSource()
Value Members
-
def
createRelation(sqlContext: SQLContext, mode: SaveMode, parameters: Map[String, String], data: DataFrame): BaseRelation
Creates an ExasolRelation after saving a org.apache.spark.sql.DataFrame into an Exasol table.
- sqlContext
A Spark org.apache.spark.sql.SQLContext context
- mode
One of the Spark save modes, org.apache.spark.sql.SaveMode
- parameters
The parameters provided as options; the table parameter is required for writes
- data
A Spark org.apache.spark.sql.DataFrame to save as an Exasol table
- returns
An ExasolRelation relation
- Definition Classes
- DefaultSource → CreatableRelationProvider
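As a sketch, writing a DataFrame through this provider might look like the following. The format name ("exasol") and the connection option keys (host, port) are assumptions based on the connector's conventions and should be checked against its user guide; only the table option is documented above as required for writes.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder().appName("exasol-write").getOrCreate()
val df = spark.range(10).toDF("ID")

// Hypothetical connection values; `table` is the required write parameter.
df.write
  .mode(SaveMode.Append)
  .format("exasol")
  .option("host", "10.0.0.11")
  .option("port", "8563")
  .option("table", "RETAIL.SALES")
  .save()
```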
-
def
createRelation(sqlContext: SQLContext, parameters: Map[String, String], schema: StructType): BaseRelation
Creates an ExasolRelation using the provided Spark org.apache.spark.sql.SQLContext, parameters and schema.
- sqlContext
A Spark org.apache.spark.sql.SQLContext context
- parameters
The parameters provided as options; the query parameter is required for reads
- schema
A user-provided schema used to select columns for the relation
- returns
An ExasolRelation relation
- Definition Classes
- DefaultSource → SchemaRelationProvider
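A read with a user-provided schema might be sketched as follows; supplying the schema means the connector does not need to infer one itself. The column names, query, and connection options here are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

val spark = SparkSession.builder().appName("exasol-read-schema").getOrCreate()

// Hypothetical schema selecting two columns from the query result.
val schema = StructType(Seq(
  StructField("ID", LongType),
  StructField("NAME", StringType)
))

val df = spark.read
  .schema(schema)
  .format("exasol")
  .option("host", "10.0.0.11")
  .option("port", "8563")
  .option("query", "SELECT ID, NAME FROM RETAIL.CUSTOMERS")
  .load()
```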
-
def
createRelation(sqlContext: SQLContext, parameters: Map[String, String]): BaseRelation
Creates an ExasolRelation using the provided Spark org.apache.spark.sql.SQLContext and parameters.
Since the schema is not provided, it is inferred by running the Exasol query with a LIMIT 1 clause.
- sqlContext
A Spark org.apache.spark.sql.SQLContext context
- parameters
The parameters provided as options; the query parameter is required for reads
- returns
An ExasolRelation relation
- Definition Classes
- DefaultSource → RelationProvider
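A minimal schemaless read might look like this sketch; because no schema is passed, the connector infers one by first issuing the query with a LIMIT 1 clause, as described above. The format name and connection options are assumptions.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("exasol-read").getOrCreate()

// Hypothetical query; `query` is the required read parameter.
// The schema is inferred from a LIMIT 1 probe of this query.
val df = spark.read
  .format("exasol")
  .option("host", "10.0.0.11")
  .option("port", "8563")
  .option("query", "SELECT * FROM RETAIL.SALES")
  .load()

df.printSchema()
```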
def
repartitionPerNode(df: DataFrame, nodesCnt: Int): DataFrame
Repartitions the dataframe to match the number of Exasol data nodes.
If nodesCnt < df.rdd.getNumPartitions, then perform df.coalesce(nodesCnt) in order to reduce the partition count.
If nodesCnt > df.rdd.getNumPartitions, then perform df.repartition(nodesCnt) so that there is a partition for each data node.
If the number of partitions and nodes are the same, then do nothing.
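The three cases above can be sketched as follows; this is a minimal sketch over the Spark DataFrame API, not the connector's actual implementation.

```scala
import org.apache.spark.sql.DataFrame

def repartitionPerNode(df: DataFrame, nodesCnt: Int): DataFrame = {
  val partitions = df.rdd.getNumPartitions
  if (nodesCnt < partitions)
    df.coalesce(nodesCnt)      // narrow transformation, avoids a full shuffle
  else if (nodesCnt > partitions)
    df.repartition(nodesCnt)   // full shuffle, one partition per data node
  else
    df                         // counts already match, nothing to do
}
```

Using coalesce for the shrinking case is the cheaper choice, since it merges existing partitions without shuffling data across the cluster.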
-
def
shortName(): String
- Definition Classes
- DefaultSource → DataSourceRegister