class ExcelDataSource extends TableProvider with DataSourceRegister
Crealytics Spark Excel data source entry point.
This class is heavily influenced by datasources.v2.FileDataSourceV2. We cannot extend FileDataSourceV2 directly, because doing so requires a fallback V1 API implementation for writing.
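As a usage sketch, the source is typically accessed through the DataFrame reader API (assuming an active SparkSession; option names such as `header` and `dataAddress` follow common spark-excel conventions and may differ in your version):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("excel-demo")
  .master("local[*]")
  .getOrCreate()

// "excel" is the short name registered via DataSourceRegister.
val df = spark.read
  .format("excel")
  .option("header", "true")             // assumed option: first row as column names
  .option("dataAddress", "'Sheet1'!A1") // assumed option: cell range to read
  .load("/path/to/report.xlsx")

df.printSchema()
```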
- By Inheritance
- ExcelDataSource
- DataSourceRegister
- TableProvider
- AnyRef
- Any
Instance Constructors
- new ExcelDataSource()
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##(): Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native() @HotSpotIntrinsicCandidate()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- def getTable(schema: StructType, partitioning: Array[Transform], properties: Map[String, String]): Table
Return a Table instance with the specified table schema, partitioning and properties to do read/write. The returned table should report the same schema and partitioning with the specified ones, or Spark may fail the operation.
- schema
The specified table schema.
- partitioning
The specified table partitioning.
- properties
The specified table properties. It's case preserving (contains exactly what users specified) and implementations are free to use it case sensitively or insensitively. It should be able to identify a table, e.g. file path, Kafka topic name, etc.
- Definition Classes
- ExcelDataSource → TableProvider
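The contract above can be illustrated with a hypothetical minimal provider (not the actual ExcelDataSource implementation; class and helper names are invented for the sketch) whose returned table echoes the given schema back:

```scala
import java.util
import java.util.Collections
import org.apache.spark.sql.connector.catalog.{Table, TableCapability, TableProvider}
import org.apache.spark.sql.connector.expressions.Transform
import org.apache.spark.sql.types.StructType
import org.apache.spark.sql.util.CaseInsensitiveStringMap

// Hypothetical provider: the returned Table must report exactly the schema
// and partitioning it was given, or Spark may fail the operation.
class ExampleProvider extends TableProvider {

  override def inferSchema(options: CaseInsensitiveStringMap): StructType =
    new StructType() // a real source would inspect file headers, sample rows, etc.

  override def getTable(
      tableSchema: StructType,
      partitioning: Array[Transform],
      properties: util.Map[String, String]): Table = new Table {
    override def name(): String = properties.getOrDefault("path", "example")
    override def schema(): StructType = tableSchema // echo the given schema back
    override def capabilities(): util.Set[TableCapability] = Collections.emptySet()
  }
}
```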
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- def inferPartitioning(options: CaseInsensitiveStringMap): Array[Transform]
Infer the partitioning of the table identified by the given options.
By default this method returns empty partitioning; please override it if this source supports partitioning.
- options
an immutable case-insensitive string-to-string map that can identify a table, e.g. file path, Kafka topic name, etc.
- Definition Classes
- ExcelDataSource → TableProvider
- def inferSchema(options: CaseInsensitiveStringMap): StructType
Infer the schema of the table identified by the given options.
- options
an immutable case-insensitive string-to-string map that can identify a table, e.g. file path, Kafka topic name, etc.
- Definition Classes
- ExcelDataSource → TableProvider
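Schema inference is triggered when the reader is given no explicit schema. A sketch, assuming an active SparkSession `spark` with this source on the classpath (the `header` option name is an assumption):

```scala
// No .schema(...) call: Spark invokes inferSchema on the provider,
// which typically samples the file(s) referenced by the load path.
val df = spark.read
  .format("excel")
  .option("header", "true") // assumed option: first row as column names
  .load("/path/to/data.xlsx")

df.printSchema() // reflects the StructType returned by inferSchema
```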
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- def shortName(): String
The string that represents the format that this data source provider uses.
- Definition Classes
- ExcelDataSource → DataSourceRegister
- def supportsExternalMetadata(): Boolean
Returns true if the source has the ability of accepting external table metadata when getting tables. The external table metadata includes:
1. For table reader: user-specified schema from DataFrameReader/DataStreamReader and schema/partitioning stored in Spark catalog.
2. For table writer: the schema of the input Dataframe of DataframeWriter/DataStreamWriter.
- Definition Classes
- ExcelDataSource → TableProvider
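The reader-side case corresponds to passing a user-specified schema, which Spark hands to getTable instead of the inferred one when this method returns true. A sketch, assuming an active SparkSession `spark`:

```scala
import org.apache.spark.sql.types.{DoubleType, StringType, StructType}

// External metadata on the read path: a user-specified schema bypasses
// inferSchema and is passed to getTable directly.
val userSchema = new StructType()
  .add("name", StringType)
  .add("amount", DoubleType)

val df = spark.read
  .format("excel")
  .schema(userSchema) // only honored if supportsExternalMetadata() is true
  .load("/path/to/data.xlsx")
```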
- final def synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
Deprecated Value Members
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] ) @Deprecated
- Deprecated