Class DataprocJob.Builder

  • All Implemented Interfaces:
    software.amazon.jsii.Builder<DataprocJob>
    Enclosing class:
    DataprocJob

    @Stability(Stable)
    public static final class DataprocJob.Builder
    extends Object
    implements software.amazon.jsii.Builder<DataprocJob>
    A fluent builder for DataprocJob.
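Putting the methods below together, a minimal usage sketch might look like the following. This is an illustration, not the canonical example: the stack class, region, cluster name, and GCS URI are assumptions, and the import paths for the generated `DataprocJob*` classes depend on how your provider bindings were generated.

```java
// Sketch only: assumes the cdktf Java bindings for the Google provider are on
// the classpath. Package names for the generated DataprocJob* classes vary by
// binding setup, so only the core cdktf imports are shown.
// import com.hashicorp.cdktf.TerraformStack;
// import software.constructs.Construct;

public class ExampleStack extends com.hashicorp.cdktf.TerraformStack {
    public ExampleStack(final software.constructs.Construct scope, final String id) {
        super(scope, id);

        DataprocJob.Builder.create(this, "example-job")     // scope + scoped construct id
            .region("us-central1")                          // illustrative region
            .placement(DataprocJobPlacement.builder()
                .clusterName("my-cluster")                  // hypothetical cluster name
                .build())
            .pysparkConfig(DataprocJobPysparkConfig.builder()
                .mainPythonFileUri("gs://my-bucket/job.py") // hypothetical GCS URI
                .build())
            .build();                                       // returns the DataprocJob
    }
}
```

Each setter returns `this`, so configuration chains until the final `build()` call, which constructs the resource in the given scope.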
    • Method Detail

      • create

        @Stability(Stable)
        public static DataprocJob.Builder create​(software.constructs.Construct scope,
                                                 String id)
        Parameters:
        scope - The scope in which to define this construct. This parameter is required.
        id - The scoped construct ID. This parameter is required.
        Returns:
        a new instance of DataprocJob.Builder.
      • connection

        @Stability(Experimental)
        public DataprocJob.Builder connection​(com.hashicorp.cdktf.SSHProvisionerConnection connection)
        Parameters:
        connection - This parameter is required.
        Returns:
        this
      • connection

        @Stability(Experimental)
        public DataprocJob.Builder connection​(com.hashicorp.cdktf.WinrmProvisionerConnection connection)
        Parameters:
        connection - This parameter is required.
        Returns:
        this
      • count

        @Stability(Experimental)
        public DataprocJob.Builder count​(Number count)
        Parameters:
        count - This parameter is required.
        Returns:
        this
      • count

        @Stability(Experimental)
        public DataprocJob.Builder count​(com.hashicorp.cdktf.TerraformCount count)
        Parameters:
        count - This parameter is required.
        Returns:
        this
      • dependsOn

        @Stability(Experimental)
        public DataprocJob.Builder dependsOn​(List<? extends com.hashicorp.cdktf.ITerraformDependable> dependsOn)
        Parameters:
        dependsOn - This parameter is required.
        Returns:
        this
      • forEach

        @Stability(Experimental)
        public DataprocJob.Builder forEach​(com.hashicorp.cdktf.ITerraformIterator forEach)
        Parameters:
        forEach - This parameter is required.
        Returns:
        this
      • lifecycle

        @Stability(Experimental)
        public DataprocJob.Builder lifecycle​(com.hashicorp.cdktf.TerraformResourceLifecycle lifecycle)
        Parameters:
        lifecycle - This parameter is required.
        Returns:
        this
      • provider

        @Stability(Experimental)
        public DataprocJob.Builder provider​(com.hashicorp.cdktf.TerraformProvider provider)
        Parameters:
        provider - This parameter is required.
        Returns:
        this
      • provisioners

        @Stability(Experimental)
        public DataprocJob.Builder provisioners​(List<? extends Object> provisioners)
        Parameters:
        provisioners - This parameter is required.
        Returns:
        this
      • placement

        @Stability(Stable)
        public DataprocJob.Builder placement​(DataprocJobPlacement placement)
        placement block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#placement

        Parameters:
        placement - placement block. This parameter is required.
        Returns:
        this
      • forceDelete

        @Stability(Stable)
        public DataprocJob.Builder forceDelete​(Boolean forceDelete)
        By default, you can only delete inactive jobs within Dataproc.

        Setting this to true and calling destroy will ensure that the job is first cancelled before the delete is issued. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#force_delete

        Parameters:
        forceDelete - By default, you can only delete inactive jobs within Dataproc. This parameter is required.
        Returns:
        this
      • forceDelete

        @Stability(Stable)
        public DataprocJob.Builder forceDelete​(com.hashicorp.cdktf.IResolvable forceDelete)
        By default, you can only delete inactive jobs within Dataproc.

        Setting this to true and calling destroy will ensure that the job is first cancelled before the delete is issued. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#force_delete

        Parameters:
        forceDelete - By default, you can only delete inactive jobs within Dataproc. This parameter is required.
        Returns:
        this
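As a sketch of the force-delete behavior described above (the cluster name is a hypothetical value, and a real job would also need one of the job config blocks such as `pysparkConfig`):

```java
// With forceDelete(true), running `terraform destroy` first cancels an active
// job and only then issues the delete; by default only inactive jobs can be
// deleted. The Boolean overload accepts an autoboxed primitive.
DataprocJob.Builder.create(this, "cancellable-job")
    .placement(DataprocJobPlacement.builder()
        .clusterName("my-cluster")        // hypothetical cluster name
        .build())
    .forceDelete(true)
    .build();
```

The `IResolvable` overload exists for cases where the value is a Terraform token rather than a literal.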
      • hadoopConfig

        @Stability(Stable)
        public DataprocJob.Builder hadoopConfig​(DataprocJobHadoopConfig hadoopConfig)
        hadoop_config block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#hadoop_config

        Parameters:
        hadoopConfig - hadoop_config block. This parameter is required.
        Returns:
        this
      • hiveConfig

        @Stability(Stable)
        public DataprocJob.Builder hiveConfig​(DataprocJobHiveConfig hiveConfig)
        hive_config block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#hive_config

        Parameters:
        hiveConfig - hive_config block. This parameter is required.
        Returns:
        this
      • id

        @Stability(Stable)
        public DataprocJob.Builder id​(String id)
        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#id.

        Please be aware that the id field is automatically added to all resources in Terraform providers built with a Terraform provider SDK version below 2. If you have trouble setting this value, it may not be configurable; check the provider documentation to confirm whether it is settable.

        Parameters:
        id - The id of the resource (see the Terraform Registry docs above). This parameter is required.
        Returns:
        this
      • labels

        @Stability(Stable)
        public DataprocJob.Builder labels​(Map<String,​String> labels)
        Optional. The labels to associate with this job.

        **Note**: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#labels

        Parameters:
        labels - Optional. The labels to associate with this job. This parameter is required.
        Returns:
        this
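A short sketch of the non-authoritative labels behavior: only the keys present in the map are managed by Terraform, while `effective_labels` exposes the full set on the resource. The label keys and values here are illustrative.

```java
// Only these two label keys are managed by this configuration; labels added
// outside of Terraform are left alone (read them via effective_labels).
java.util.Map<String, String> labels = new java.util.HashMap<>();
labels.put("env", "dev");
labels.put("team", "analytics");

DataprocJob.Builder.create(this, "labeled-job")
    .placement(DataprocJobPlacement.builder()
        .clusterName("my-cluster")        // hypothetical cluster name
        .build())
    .labels(labels)
    .build();
```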
      • pigConfig

        @Stability(Stable)
        public DataprocJob.Builder pigConfig​(DataprocJobPigConfig pigConfig)
        pig_config block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#pig_config

        Parameters:
        pigConfig - pig_config block. This parameter is required.
        Returns:
        this
      • prestoConfig

        @Stability(Stable)
        public DataprocJob.Builder prestoConfig​(DataprocJobPrestoConfig prestoConfig)
        presto_config block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#presto_config

        Parameters:
        prestoConfig - presto_config block. This parameter is required.
        Returns:
        this
      • project

        @Stability(Stable)
        public DataprocJob.Builder project​(String project)
        The project in which the cluster can be found and against which jobs are subsequently run.

        If it is not provided, the provider project is used. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#project

        Parameters:
        project - The project in which the cluster can be found and against which jobs are run. This parameter is required.
        Returns:
        this
      • pysparkConfig

        @Stability(Stable)
        public DataprocJob.Builder pysparkConfig​(DataprocJobPysparkConfig pysparkConfig)
        pyspark_config block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#pyspark_config

        Parameters:
        pysparkConfig - pyspark_config block. This parameter is required.
        Returns:
        this
      • reference

        @Stability(Stable)
        public DataprocJob.Builder reference​(DataprocJobReference reference)
        reference block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#reference

        Parameters:
        reference - reference block. This parameter is required.
        Returns:
        this
      • region

        @Stability(Stable)
        public DataprocJob.Builder region​(String region)
        The Cloud Dataproc region.

        This determines which clusters are available for this job to be submitted to. If not specified, defaults to global. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#region

        Parameters:
        region - The Cloud Dataproc region. This parameter is required.
        Returns:
        this
      • scheduling

        @Stability(Stable)
        public DataprocJob.Builder scheduling​(DataprocJobScheduling scheduling)
        scheduling block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#scheduling

        Parameters:
        scheduling - scheduling block. This parameter is required.
        Returns:
        this
      • sparkConfig

        @Stability(Stable)
        public DataprocJob.Builder sparkConfig​(DataprocJobSparkConfig sparkConfig)
        spark_config block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#spark_config

        Parameters:
        sparkConfig - spark_config block. This parameter is required.
        Returns:
        this
      • sparksqlConfig

        @Stability(Stable)
        public DataprocJob.Builder sparksqlConfig​(DataprocJobSparksqlConfig sparksqlConfig)
        sparksql_config block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#sparksql_config

        Parameters:
        sparksqlConfig - sparksql_config block. This parameter is required.
        Returns:
        this
      • timeouts

        @Stability(Stable)
        public DataprocJob.Builder timeouts​(DataprocJobTimeouts timeouts)
        timeouts block.

        Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#timeouts

        Parameters:
        timeouts - timeouts block. This parameter is required.
        Returns:
        this
      • build

        @Stability(Stable)
        public DataprocJob build()
        Specified by:
        build in interface software.amazon.jsii.Builder<DataprocJob>
        Returns:
        a newly built instance of DataprocJob.