Packages

package temp

Type Members

  1. case class AlterTableClusterBy(table: LogicalPlan, clusterBySpec: Option[ClusterBySpec]) extends LogicalPlan with AlterTableCommand with Product with Serializable

    The logical plan of the following commands:

    • ALTER TABLE ... CLUSTER BY (col1, col2, ...)
    • ALTER TABLE ... CLUSTER BY NONE
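
    The two command forms above map naturally onto the `Option[ClusterBySpec]` field: `Some(spec)` for an explicit column list, `None` for `CLUSTER BY NONE`. The sketch below is a hypothetical minimal mock (plain `String` column names instead of `NamedReference`, no real `LogicalPlan` hierarchy) that illustrates this encoding:

    ```scala
    // Hypothetical minimal mocks -- not the real Spark/Delta classes.
    case class ClusterBySpec(columnNames: Seq[String])
    case class AlterTableClusterBy(table: String, clusterBySpec: Option[ClusterBySpec])

    // Render the command each plan represents, branching on the Option.
    def describe(cmd: AlterTableClusterBy): String = cmd.clusterBySpec match {
      case Some(spec) => s"CLUSTER BY (${spec.columnNames.mkString(", ")})"
      case None       => "CLUSTER BY NONE"
    }

    val clusterBy     = describe(AlterTableClusterBy("t", Some(ClusterBySpec(Seq("col1", "col2")))))
    val clusterByNone = describe(AlterTableClusterBy("t", None))
    ```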
  2. case class ClusterBy(clusteringColumns: Seq[NamedReference]) extends TableChange with Product with Serializable

    A TableChange to alter clustering columns for a table.

  3. case class ClusterByParserUtils(clusterByPlan: ClusterByPlan, delegate: ParserInterface) extends Product with Serializable

    Parser utils for parsing a ClusterByPlan and converting it to table properties.

    This class will be removed when we integrate with OSS Spark's CLUSTER BY implementation.

    See also

    https://github.com/apache/spark/pull/42577
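
    The conversion to table properties might look like the following sketch. The property key and the comma-joined encoding here are illustrative assumptions; the actual key and serialization used by Delta are not documented on this page:

    ```scala
    // Hypothetical sketch of flattening a CLUSTER BY spec into table properties.
    // The property key "clusterByColumns" and the encoding are assumptions.
    case class ClusterBySpec(columnNames: Seq[String])

    def toProperties(spec: ClusterBySpec): Map[String, String] =
      Map("clusterByColumns" -> spec.columnNames.mkString(","))

    val props = toProperties(ClusterBySpec(Seq("region", "date")))
    ```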

  4. case class ClusterByPlan(clusterBySpec: ClusterBySpec, startIndex: Int, stopIndex: Int, parenStartIndex: Int, parenStopIndex: Int, ctx: ParserRuleContext) extends LogicalPlan with LeafNode with Product with Serializable

    A LogicalPlan representing a CLUSTER BY clause.

    This class will be removed when we integrate with OSS Spark's CLUSTER BY implementation.

    See also

    https://github.com/apache/spark/pull/42577

  5. case class ClusterBySpec(columnNames: Seq[NamedReference]) extends Product with Serializable

    A container for clustering information. Copied from OSS Spark.

    This class will be removed when we integrate with OSS Spark's CLUSTER BY implementation.

    columnNames: the names of the columns used for clustering.

    See also

    https://github.com/apache/spark/pull/42577

  6. final case class ClusterByTransform(columnNames: Seq[NamedReference]) extends Transform with Product with Serializable

    Minimal version of Spark's ClusterByTransform. We'll remove this when we integrate with OSS Spark's CLUSTER BY implementation.

    This class represents a transform for ClusterBySpec. This is used to bundle ClusterBySpec in CreateTable's partitioning transforms to pass it down to analyzer/delta.

Value Members

  1. object ClusterBySpec extends Serializable
  2. object ClusterByTransform extends Serializable

    Convenience extractor for ClusterByTransform.
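
    The sketch below illustrates the extractor pattern with hypothetical minimal mocks of the transform hierarchy (plain `String` column names, an invented `IdentityTransform` sibling): because `ClusterByTransform` is a case class, its companion's `unapply` lets callers pull clustering columns out of a mixed list of partitioning transforms with a single pattern match.

    ```scala
    // Hypothetical minimal mocks of a Transform hierarchy -- not Spark's API.
    sealed trait Transform
    final case class ClusterByTransform(columnNames: Seq[String]) extends Transform
    final case class IdentityTransform(column: String) extends Transform

    // ClusterBySpec bundled into CreateTable-style partitioning transforms.
    val partitioning: Seq[Transform] =
      Seq(IdentityTransform("date"), ClusterByTransform(Seq("c1", "c2")))

    // The companion extractor recovers the clustering columns downstream.
    val clusteringCols: Option[Seq[String]] = partitioning.collectFirst {
      case ClusterByTransform(cols) => cols
    }
    ```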
