case class TextTable(name: String, sparkSession: SparkSession, options: CaseInsensitiveStringMap, paths: Seq[String], userSpecifiedSchema: Option[StructType], fallbackFileFormat: Class[_ <: FileFormat]) extends FileTable with Product with Serializable

Linear Supertypes
Serializable, Serializable, Product, Equals, FileTable, SupportsWrite, SupportsRead, Table, AnyRef, Any

Instance Constructors

  1. new TextTable(name: String, sparkSession: SparkSession, options: CaseInsensitiveStringMap, paths: Seq[String], userSpecifiedSchema: Option[StructType], fallbackFileFormat: Class[_ <: FileFormat])

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def capabilities(): Set[TableCapability]
    Definition Classes
    FileTable → Table
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  7. def columns(): Array[connector.catalog.Column]
    Definition Classes
    Table
  8. lazy val dataSchema: StructType
    Definition Classes
    FileTable
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. val fallbackFileFormat: Class[_ <: FileFormat]

    Returns a V1 FileFormat class of the same file data source. This is a solution for the following cases: 1. file data source V2 implementations cause regressions, so users can disable the problematic data source via SQL configuration and fall back to FileFormat; 2. catalog support is required, which is still under development for data source V2.

    Definition Classes
    TextTable → FileTable
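
    As a sketch of case 1 in practice, the V1 fallback is controlled through a SQL configuration listing the sources that should bypass their V2 implementations. The key name below is an assumption based on recent Spark versions; verify it against your Spark release's configuration documentation.

    ```
    # spark-defaults.conf -- key name assumed, check your Spark version's docs.
    # Sources named here are planned through their V1 FileFormat instead of the
    # V2 FileTable implementation (e.g. this TextTable for "text").
    spark.sql.sources.useV1SourceList=avro,csv,json,kafka,orc,parquet,text
    ```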
  11. lazy val fileIndex: PartitioningAwareFileIndex
    Definition Classes
    FileTable
  12. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. def formatName: String

    The string that represents the format that this data source provider uses. This is overridden by children to provide a nice alias for the data source. For example:

    override def formatName(): String = "ORC"
    Definition Classes
    TextTable → FileTable
  14. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  15. def inferSchema(files: Seq[FileStatus]): Option[StructType]

    When possible, this method should return the schema of the given files. When the format does not support inference, or no valid files are given, it should return None; in these cases Spark will require that the user specify the schema manually.

    Definition Classes
    TextTable → FileTable
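
    The contract can be illustrated with a minimal, self-contained sketch. The Field/Schema types below are simplified stand-ins for Spark's StructField/StructType, not the real API; the point is the Option-based contract. A text-like source always exposes a single string column, so inference succeeds whenever at least one valid file is present:

    ```scala
    // Simplified stand-ins for Spark's StructField/StructType, for illustration only.
    case class Field(name: String, dataType: String)
    case class Schema(fields: Seq[Field])

    // Sketch of the inferSchema contract for a text-like source: if no valid
    // files are given, return None and let Spark require a user-specified
    // schema; otherwise text data always maps to one string column.
    def inferSchema(files: Seq[String]): Option[Schema] =
      if (files.isEmpty) None
      else Some(Schema(Seq(Field("value", "string"))))
    ```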
  16. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  17. val name: String
    Definition Classes
    TextTable → Table
  18. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  19. def newScanBuilder(options: CaseInsensitiveStringMap): TextScanBuilder
    Definition Classes
    TextTable → SupportsRead
  20. def newWriteBuilder(info: LogicalWriteInfo): WriteBuilder
    Definition Classes
    TextTable → SupportsWrite
  21. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  22. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  23. val options: CaseInsensitiveStringMap
  24. def partitioning(): Array[Transform]
    Definition Classes
    FileTable → Table
  25. val paths: Seq[String]
  26. def properties(): Map[String, String]
    Definition Classes
    FileTable → Table
  27. lazy val schema: StructType
    Definition Classes
    FileTable → Table
  28. val sparkSession: SparkSession
  29. def supportsDataType(dataType: DataType): Boolean

    Returns whether this format supports the given DataType in the read/write path. By default, all data types are supported.

    Definition Classes
    TextTable → FileTable
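
    For a text source, a natural override restricts this to string data, since raw lines cannot round-trip any other type. A self-contained sketch follows; the DataType hierarchy below is a mock for illustration, not Spark's actual type system:

    ```scala
    // Mock stand-ins for Spark's DataType hierarchy, for illustration only.
    sealed trait DataType
    case object StringType extends DataType
    case object IntegerType extends DataType

    // A text source stores each row as a raw line, so only string data can be
    // read or written faithfully; everything else is rejected up front.
    def supportsDataType(dataType: DataType): Boolean = dataType == StringType
    ```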
  30. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  31. val userSpecifiedSchema: Option[StructType]
  32. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  33. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
