class DefaultSource extends RelationProvider with SchemaRelationProvider with CreatableRelationProvider with StreamSinkProvider with DataSourceRegister
Snowflake Source implementation for Spark SQL. Major TODO points:
- Add support for compression Snowflake -> Spark
- Add support for using Snowflake stage files, so the user doesn't need to provide AWS credentials
- Add support for VARIANT
Linear Supertypes
- DataSourceRegister
- StreamSinkProvider
- CreatableRelationProvider
- SchemaRelationProvider
- RelationProvider
- AnyRef
- Any
Instance Constructors
- new DefaultSource()
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- def createRelation(sqlContext: SQLContext, saveMode: SaveMode, parameters: Map[String, String], data: DataFrame): BaseRelation
  Creates a Relation instance by first writing the contents of the given DataFrame to Snowflake.
  - Definition Classes: DefaultSource → CreatableRelationProvider
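This write-path overload is normally reached through the DataFrame writer API rather than called directly. A minimal sketch, assuming a live SparkSession and placeholder Snowflake connection values (the option names follow the connector's documented convention; the concrete values here are not real):

```scala
import org.apache.spark.sql.SaveMode

// Placeholder connection options -- substitute real account values.
val sfOptions = Map(
  "sfURL"       -> "account.snowflakecomputing.com",
  "sfUser"      -> "user",
  "sfPassword"  -> "password",
  "sfDatabase"  -> "db",
  "sfSchema"    -> "public",
  "sfWarehouse" -> "wh")

// Writing a DataFrame routes through the CreatableRelationProvider
// overload of createRelation shown above.
df.write
  .format("net.snowflake.spark.snowflake")
  .options(sfOptions)
  .option("dbtable", "target_table")
  .mode(SaveMode.Overwrite)   // becomes the saveMode parameter above
  .save()
```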
- def createRelation(sqlContext: SQLContext, parameters: Map[String, String], schema: StructType): BaseRelation
  Load a SnowflakeRelation using a user-provided schema, so no inference over JDBC will be used.
  - Definition Classes: DefaultSource → SchemaRelationProvider
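Supplying an explicit schema on the reader selects this SchemaRelationProvider overload and skips JDBC schema inference. A hedged sketch, with placeholder connection options and a hypothetical table layout:

```scala
import org.apache.spark.sql.types._

// Placeholder connection options -- substitute real account values.
val sfOptions = Map(
  "sfURL" -> "account.snowflakecomputing.com",
  "sfUser" -> "user", "sfPassword" -> "password",
  "sfDatabase" -> "db", "sfSchema" -> "public", "sfWarehouse" -> "wh")

// A user-provided schema; no round trip over JDBC to infer column types.
val schema = StructType(Seq(
  StructField("ID",   LongType,   nullable = false),
  StructField("NAME", StringType, nullable = true)))

val df = spark.read
  .format("net.snowflake.spark.snowflake")
  .options(sfOptions)
  .option("dbtable", "source_table")
  .schema(schema)   // routes to the SchemaRelationProvider overload
  .load()
```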
- def createRelation(sqlContext: SQLContext, parameters: Map[String, String]): BaseRelation
  Create a new SnowflakeRelation instance using parameters from Spark SQL DDL. Resolves the schema using a JDBC connection over the provided URL, which must contain credentials.
  - Definition Classes: DefaultSource → RelationProvider
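The schema-less overload is what Spark SQL DDL hits: registering a Snowflake-backed table by name resolves the schema over JDBC. A hedged sketch; the view name and all credential values are placeholders:

```scala
// Registering a Snowflake-backed relation from Spark SQL DDL routes to
// the RelationProvider overload; the schema is resolved over JDBC.
// All OPTIONS values below are placeholders.
spark.sql("""
  CREATE TEMPORARY VIEW sf_table
  USING net.snowflake.spark.snowflake
  OPTIONS (
    sfURL 'account.snowflakecomputing.com',
    sfUser 'user',
    sfPassword 'password',
    sfDatabase 'db',
    sfSchema 'public',
    sfWarehouse 'wh',
    dbtable 'source_table'
  )
""")

spark.sql("SELECT * FROM sf_table WHERE ID > 100").show()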
- def createSink(sqlContext: SQLContext, parameters: Map[String, String], partitionColumns: Seq[String], outputMode: OutputMode): Sink
  - Definition Classes: DefaultSource → StreamSinkProvider
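createSink backs Structured Streaming writes. A hedged sketch only: whether streaming writes are supported, and which extra options they require, depends on the connector version, and all connection values below are placeholders:

```scala
// Placeholder connection options -- substitute real account values.
val sfOptions = Map(
  "sfURL" -> "account.snowflakecomputing.com",
  "sfUser" -> "user", "sfPassword" -> "password",
  "sfDatabase" -> "db", "sfSchema" -> "public", "sfWarehouse" -> "wh")

// A streaming write goes through DefaultSource.createSink
// (StreamSinkProvider); streamingDf is assumed to be a streaming DataFrame.
val query = streamingDf.writeStream
  .format("net.snowflake.spark.snowflake")
  .options(sfOptions)
  .option("dbtable", "stream_target")
  .option("checkpointLocation", "/tmp/sf_checkpoint")
  .outputMode("append")   // becomes the outputMode parameter above
  .start()
```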
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- def shortName(): String
  - Definition Classes: DefaultSource → DataSourceRegister
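shortName lets callers refer to the source by a registered alias instead of the fully qualified class name. Assuming the connector registers "snowflake" as its short name (an assumption; check shortName() in your version), a read looks like:

```scala
// Placeholder connection options -- substitute real account values.
val sfOptions = Map(
  "sfURL" -> "account.snowflakecomputing.com",
  "sfUser" -> "user", "sfPassword" -> "password",
  "sfDatabase" -> "db", "sfSchema" -> "public", "sfWarehouse" -> "wh")

// If shortName() returns "snowflake", the alias can replace the full
// class name net.snowflake.spark.snowflake in format(...).
val df = spark.read
  .format("snowflake")
  .options(sfOptions)
  .option("dbtable", "source_table")
  .load()
```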
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()