Class net.snowflake.spark.snowflake.Parameters.MergedParameters

case class MergedParameters(parameters: Map[String, String]) extends Product with Serializable

Adds validators and accessors to the underlying string parameter map.
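To illustrate what "validators and accessors" means here, the following is a minimal sketch of the same pattern (our own mock class, not the connector's actual code): a case class wrapping Map[String, String] with typed accessors, using lowercased key names that mirror documented connector options.

```scala
// Illustrative sketch only -- ParamsSketch is a stand-in, not MergedParameters.
case class ParamsSketch(parameters: Map[String, String]) {
  // Required parameter: fail fast if missing.
  def sfDatabase: String =
    parameters.getOrElse("sfdatabase", sys.error("sfdatabase is required"))
  // Optional parameter.
  def sfRole: Option[String] = parameters.get("sfrole")
  // Boolean flag with a default.
  def autoPushdown: Boolean =
    parameters.getOrElse("autopushdown", "on").equalsIgnoreCase("on")
}

val p = ParamsSketch(Map("sfdatabase" -> "DB"))
```

A string map keeps option parsing uniform (everything arrives as strings from the DataFrame options), while the accessors centralize defaulting and validation.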

Linear Supertypes
Serializable, Serializable, Product, Equals, AnyRef, Any

Instance Constructors

  1. new MergedParameters(parameters: Map[String, String])

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes: AnyRef → Any
  2. final def ##(): Int

    Definition Classes: AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes: AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes: Any
  5. def autoPushdown: Boolean

    Whether query pushdown to Snowflake is automatically enabled or disabled.

  6. def awsAccessKey: Option[String]
  7. def awsSecretKey: Option[String]
  8. def azureSAS: Option[String]
  9. def bindVariableEnabled: Boolean
  10. def checkBucketConfiguration: Boolean

    Returns true if the bucket lifecycle configuration should be checked.

  11. def clone(): AnyRef

    Attributes: protected[java.lang]
    Definition Classes: AnyRef
    Annotations: @throws( ... )
  12. def columnMap: Option[Map[String, String]]

    Retrieves the column mapping data; None if empty.

  13. def columnMapping: String
  14. def columnMismatchBehavior: String
  15. def continueOnError: Boolean

    Sets the on_error parameter to CONTINUE in the COPY command. TODO: create a data validation function on the Spark side instead of using the COPY command.

  16. def createPerQueryTempDir(): String

    Creates a per-query subdirectory in the rootTempDir, with a random UUID.
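The behavior described above can be sketched as follows (the helper name and exact path layout are ours, for illustration only):

```scala
import java.util.UUID

// Append a random UUID subdirectory to the configured root temp dir,
// so each query's intermediate files land in their own directory.
def perQueryTempDir(rootTempDir: String): String =
  s"${rootTempDir.stripSuffix("/")}/${UUID.randomUUID().toString}"
```

A per-query random subdirectory avoids collisions when multiple queries share the same rootTempDir.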

  17. final def eq(arg0: AnyRef): Boolean

    Definition Classes: AnyRef
  18. def expectedPartitionCount: Int
  19. def expectedPartitionSize: Long
  20. def extraCopyOptions: String

    Extra options to append to the Snowflake COPY command (e.g. "MAXERROR 100").

  21. def finalize(): Unit

    Attributes: protected[java.lang]
    Definition Classes: AnyRef
    Annotations: @throws( classOf[java.lang.Throwable] )
  22. final def getClass(): Class[_]

    Definition Classes: AnyRef → Any
  23. def getQueryResultFormat: Option[String]

    Snowflake query result format.

  24. def getTimeOutputFormat: Option[String]

    Snowflake time output format.

  25. final def isInstanceOf[T0]: Boolean

    Definition Classes: Any
  26. def isSslON: Boolean
  27. def isTimezoneSnowflake: Boolean
  28. def isTimezoneSnowflakeDefault: Boolean
  29. def isTimezoneSpark: Boolean
  30. def keepOriginalColumnNameCase: Boolean
  31. def maxRetryCount: Int
  32. final def ne(arg0: AnyRef): Boolean

    Definition Classes: AnyRef
  33. def nonProxyHosts: Option[String]
  34. final def notify(): Unit

    Definition Classes: AnyRef
  35. final def notifyAll(): Unit

    Definition Classes: AnyRef
  36. lazy val parallelism: Option[Int]

    Number of threads used for PUT/GET.

  37. val parameters: Map[String, String]
  38. def postActions: Array[String]

    List of semicolon-separated SQL statements to run after successful write operations. This can be useful for running GRANT operations to make your new tables readable to other users and groups.

    If the action string contains %s, the table name will be substituted in, in case a staging table is being used.

    Defaults to empty.

  39. def preActions: Array[String]

    List of semicolon-separated SQL statements to run before write operations. This can be useful for running DELETE operations to clean up data.

    If the action string contains %s, the table name will be substituted in, in case a staging table is being used.

    Defaults to empty.

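The splitting and %s substitution described for preActions and postActions can be sketched like this (the helper is ours; the connector's internal handling may differ):

```scala
// Split a semicolon-separated action string into statements and substitute
// the (possibly staging) table name for every %s placeholder.
def expandActions(actions: String, tableName: String): Array[String] =
  actions.split(";").map(_.trim).filter(_.nonEmpty).map(_.replace("%s", tableName))

val stmts = expandActions(
  "GRANT SELECT ON %s TO ROLE r1; GRANT SELECT ON %s TO ROLE r2",
  "my_table")
```

Because %s is replaced with whatever table the write actually targets, the same action string works whether or not a staging table is in play.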

  40. def privateKey: Option[PrivateKey]

    Generates a private key from a PEM key value.

    returns: the private key object
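Parsing a PKCS#8 PEM value into a java.security.PrivateKey with the JDK looks roughly like the sketch below (our own helper, not the connector's implementation, which may handle more key formats):

```scala
import java.security.{KeyFactory, PrivateKey}
import java.security.spec.PKCS8EncodedKeySpec
import java.util.Base64

// Strip the PEM header/footer, Base64-decode the body (the MIME decoder
// tolerates line breaks), and hand the DER bytes to a KeyFactory.
def privateKeyFromPem(pem: String): PrivateKey = {
  val der = Base64.getMimeDecoder.decode(
    pem.replace("-----BEGIN PRIVATE KEY-----", "")
       .replace("-----END PRIVATE KEY-----", "")
       .trim)
  KeyFactory.getInstance("RSA").generatePrivate(new PKCS8EncodedKeySpec(der))
}
```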

  41. def proxyHost: Option[String]
  42. lazy val proxyInfo: Option[ProxyInfo]
  43. def proxyPassword: Option[String]
  44. def proxyPort: Option[String]
  45. def proxyUser: Option[String]
  46. def purge(): Boolean

    Whether or not to include PURGE in the COPY statement generated by the Spark connector.

  47. def query: Option[String]

    The Snowflake query to be used as the target when loading data.

  48. lazy val rootTempDir: String

    A root directory to be used for intermediate data exchange, expected to be on cloud storage (S3 or Azure storage), or somewhere that can be written to and read from by Snowflake. Make sure that credentials are available for this cloud provider.

  49. lazy val rootTempDirStorageType: FSType
  50. def s3maxfilesize: String

    Max file size used when moving data out of Snowflake.

  51. def setColumnMap(fromSchema: Option[StructType], toSchema: Option[StructType]): Unit

    Sets the column map.

  52. def sfAccount: Option[String]

    Snowflake account - optional.

  53. def sfCompress: Boolean

    Snowflake use compression on/off - "on" by default.

  54. def sfDatabase: String

    Snowflake database name.

  55. def sfExtraOptions: Map[String, AnyRef]

    Returns a map of options that are not known to the connector and are passed verbatim to the JDBC driver.

  56. def sfPassword: String

    Snowflake password.

  57. def sfRole: Option[String]

    Snowflake role - optional.

  58. def sfSSL: String

    Snowflake SSL on/off - "on" by default.

  59. def sfSchema: String

    Snowflake schema.

  60. def sfTimezone: Option[String]

    Snowflake timezone - optional.

  61. def sfURL: String

    URL pointing to the Snowflake database, simply host:port.

  62. def sfUser: String

    Snowflake user.

  63. def sfWarehouse: Option[String]

    Snowflake warehouse.

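A typical option map supplying the sf* parameters above might look like the following sketch (key names follow the connector's documented option names; all values are placeholders):

```scala
// Placeholder connection options for the Spark-Snowflake connector.
val sfOptions = Map(
  "sfURL"       -> "myaccount.snowflakecomputing.com",
  "sfUser"      -> "jsmith",
  "sfPassword"  -> "********",
  "sfDatabase"  -> "MY_DB",
  "sfSchema"    -> "PUBLIC",
  "sfWarehouse" -> "MY_WH",   // optional
  "sfRole"      -> "ANALYST"  // optional
)
// These would normally be passed to a DataFrame reader/writer, e.g.
// spark.read.format("net.snowflake.spark.snowflake").options(sfOptions)...
```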

  64. def storagePath: Option[String]
  65. def streamingStage: Option[String]
  66. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes: AnyRef
  67. def table: Option[TableName]

    The Snowflake table to be used as the target when loading or writing data.

  68. def temporaryAWSCredentials: Option[AWSCredentials]

    Temporary AWS credentials which are passed to Snowflake. These only need to be supplied by the user when Hadoop is configured to authenticate to S3 via IAM roles assigned to EC2 instances.

  69. def temporaryAzureStorageCredentials: Option[StorageCredentialsSharedAccessSignature]

    SAS token to be passed to Snowflake to access data in Azure storage. We currently don't support a full storage account key, so this has to be provided if the customer would like to load data through their storage account directly.

  70. def toString(): String

    Definition Classes: MergedParameters → AnyRef → Any
  71. def truncateColumns(): Boolean

    Whether or not to include TRUNCATE_COLUMNS in the COPY statement generated by the Spark connector.

  72. def truncateTable: Boolean

    Truncate the table when overwriting, keeping the table schema.

  73. def useCopyUnload: Boolean
  74. def useProxy: Boolean

    Whether proxy-related parameters are in use.

  75. def useStagingTable: Boolean

    When true, data is always loaded into a new temporary table when performing an overwrite. This is to ensure that the whole load process succeeds before dropping any data from Snowflake, which can be useful if, in the event of failures, stale data is better than no data for your systems.

    Defaults to true.

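The two overwrite strategies this flag selects between can be sketched as below (our own pseudo-SQL and helper, not the connector's actual statements):

```scala
// With a staging table, data is loaded into a fresh temporary table and only
// swapped into place after the load succeeds; without one, the target is
// truncated up front and loaded directly.
def overwritePlan(useStagingTable: Boolean, target: String): Seq[String] =
  if (useStagingTable) Seq(
    s"CREATE TEMPORARY TABLE ${target}_staging LIKE $target",
    s"/* load all data into ${target}_staging */",
    s"ALTER TABLE $target SWAP WITH ${target}_staging" // only after the load succeeds
  )
  else Seq(
    s"TRUNCATE TABLE $target",
    s"/* load data directly into $target */"
  )
```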

  76. lazy val usingExternalStage: Boolean
  77. final def wait(): Unit

    Definition Classes: AnyRef
    Annotations: @throws( ... )
  78. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes: AnyRef
    Annotations: @throws( ... )
  79. final def wait(arg0: Long): Unit

    Definition Classes: AnyRef
    Annotations: @throws( ... )
