Class DataflowJob.Builder
java.lang.Object
  com.hashicorp.cdktf.providers.google.dataflow_job.DataflowJob.Builder

All Implemented Interfaces:
software.amazon.jsii.Builder<DataflowJob>
Enclosing class:
DataflowJob
@Stability(Stable) public static final class DataflowJob.Builder extends Object implements software.amazon.jsii.Builder<DataflowJob>
A fluent builder for DataflowJob.
-
-
Method Summary
static DataflowJob.Builder create(software.constructs.Construct scope, String id)
DataflowJob.Builder additionalExperiments(List<String> additionalExperiments) - List of experiments that should be used by the job.
DataflowJob.Builder connection(com.hashicorp.cdktf.SSHProvisionerConnection connection)
DataflowJob.Builder connection(com.hashicorp.cdktf.WinrmProvisionerConnection connection)
DataflowJob.Builder count(com.hashicorp.cdktf.TerraformCount count)
DataflowJob.Builder count(Number count)
DataflowJob.Builder dependsOn(List<? extends com.hashicorp.cdktf.ITerraformDependable> dependsOn)
DataflowJob.Builder enableStreamingEngine(com.hashicorp.cdktf.IResolvable enableStreamingEngine) - Indicates if the job should use the streaming engine feature.
DataflowJob.Builder enableStreamingEngine(Boolean enableStreamingEngine) - Indicates if the job should use the streaming engine feature.
DataflowJob.Builder forEach(com.hashicorp.cdktf.ITerraformIterator forEach)
DataflowJob.Builder id(String id) - Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#id DataflowJob#id}.
DataflowJob.Builder ipConfiguration(String ipConfiguration) - The configuration for VM IPs.
DataflowJob.Builder kmsKeyName(String kmsKeyName) - The name for the Cloud KMS key for the job.
DataflowJob.Builder labels(Map<String,String> labels) - User labels to be specified for the job.
DataflowJob.Builder lifecycle(com.hashicorp.cdktf.TerraformResourceLifecycle lifecycle)
DataflowJob.Builder machineType(String machineType) - The machine type to use for the job.
DataflowJob.Builder maxWorkers(Number maxWorkers) - The number of workers permitted to work on the job.
DataflowJob.Builder name(String name) - A unique name for the resource, required by Dataflow.
DataflowJob.Builder network(String network) - The network to which VMs will be assigned.
DataflowJob.Builder onDelete(String onDelete) - One of "drain" or "cancel".
DataflowJob.Builder parameters(Map<String,String> parameters) - Key/Value pairs to be passed to the Dataflow job (as used in the template).
DataflowJob.Builder project(String project) - The project in which the resource belongs.
DataflowJob.Builder provider(com.hashicorp.cdktf.TerraformProvider provider)
DataflowJob.Builder provisioners(List<? extends Object> provisioners)
DataflowJob.Builder region(String region) - The region in which the created job should run.
DataflowJob.Builder serviceAccountEmail(String serviceAccountEmail) - The Service Account email used to create the job.
DataflowJob.Builder skipWaitOnJobTermination(com.hashicorp.cdktf.IResolvable skipWaitOnJobTermination) - If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
DataflowJob.Builder skipWaitOnJobTermination(Boolean skipWaitOnJobTermination) - If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
DataflowJob.Builder subnetwork(String subnetwork) - The subnetwork to which VMs will be assigned.
DataflowJob.Builder tempGcsLocation(String tempGcsLocation) - A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.
DataflowJob.Builder templateGcsPath(String templateGcsPath) - The Google Cloud Storage path to the Dataflow job template.
DataflowJob.Builder timeouts(DataflowJobTimeouts timeouts) - timeouts block.
DataflowJob.Builder transformNameMapping(Map<String,String> transformNameMapping) - Only applicable when updating a pipeline.
DataflowJob.Builder zone(String zone) - The zone in which the created job should run.
DataflowJob build()
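Taken together, a minimal use of this builder looks like the sketch below. It assumes a CDKTF Java project with the Google provider bindings on the classpath; the job name, template path, and bucket are hypothetical placeholder values, not values required by the resource.

```java
import software.constructs.Construct;
import com.hashicorp.cdktf.TerraformStack;
import com.hashicorp.cdktf.providers.google.dataflow_job.DataflowJob;

public class DataflowStack extends TerraformStack {
    public DataflowStack(Construct scope, String id) {
        super(scope, id);

        // Minimal builder chain: create() plus the required setters
        // (name, templateGcsPath, tempGcsLocation), then build().
        DataflowJob.Builder.create(this, "wordcount")
                .name("my-wordcount-job")                                     // hypothetical job name
                .templateGcsPath("gs://dataflow-templates/latest/Word_Count") // template to run
                .tempGcsLocation("gs://my-bucket/tmp")                        // hypothetical bucket
                .onDelete("drain")
                .maxWorkers(4)
                .build();
    }
}
```

Optional setters such as onDelete and maxWorkers can be chained in any order before build(); only create() must come first and build() last.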
-
-
-
Method Detail
-
create
@Stability(Stable) public static DataflowJob.Builder create(software.constructs.Construct scope, String id)
Parameters:
scope - The scope in which to define this construct. This parameter is required.
id - The scoped construct ID. This parameter is required.
Returns:
a new instance of DataflowJob.Builder.
-
connection
@Stability(Experimental) public DataflowJob.Builder connection(com.hashicorp.cdktf.SSHProvisionerConnection connection)
Parameters:
connection - This parameter is required.
Returns:
this
-
connection
@Stability(Experimental) public DataflowJob.Builder connection(com.hashicorp.cdktf.WinrmProvisionerConnection connection)
Parameters:
connection - This parameter is required.
Returns:
this
-
count
@Stability(Experimental) public DataflowJob.Builder count(Number count)
Parameters:
count - This parameter is required.
Returns:
this
-
count
@Stability(Experimental) public DataflowJob.Builder count(com.hashicorp.cdktf.TerraformCount count)
Parameters:
count - This parameter is required.
Returns:
this
-
dependsOn
@Stability(Experimental) public DataflowJob.Builder dependsOn(List<? extends com.hashicorp.cdktf.ITerraformDependable> dependsOn)
Parameters:
dependsOn - This parameter is required.
Returns:
this
-
forEach
@Stability(Experimental) public DataflowJob.Builder forEach(com.hashicorp.cdktf.ITerraformIterator forEach)
Parameters:
forEach - This parameter is required.
Returns:
this
-
lifecycle
@Stability(Experimental) public DataflowJob.Builder lifecycle(com.hashicorp.cdktf.TerraformResourceLifecycle lifecycle)
Parameters:
lifecycle - This parameter is required.
Returns:
this
-
provider
@Stability(Experimental) public DataflowJob.Builder provider(com.hashicorp.cdktf.TerraformProvider provider)
Parameters:
provider - This parameter is required.
Returns:
this
-
provisioners
@Stability(Experimental) public DataflowJob.Builder provisioners(List<? extends Object> provisioners)
Parameters:
provisioners - This parameter is required.
Returns:
this
-
name
@Stability(Stable) public DataflowJob.Builder name(String name)
A unique name for the resource, required by Dataflow. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#name DataflowJob#name}
Parameters:
name - A unique name for the resource, required by Dataflow. This parameter is required.
Returns:
this
-
tempGcsLocation
@Stability(Stable) public DataflowJob.Builder tempGcsLocation(String tempGcsLocation)
A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#temp_gcs_location DataflowJob#temp_gcs_location}
Parameters:
tempGcsLocation - A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data. This parameter is required.
Returns:
this
-
templateGcsPath
@Stability(Stable) public DataflowJob.Builder templateGcsPath(String templateGcsPath)
The Google Cloud Storage path to the Dataflow job template. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#template_gcs_path DataflowJob#template_gcs_path}
Parameters:
templateGcsPath - The Google Cloud Storage path to the Dataflow job template. This parameter is required.
Returns:
this
-
additionalExperiments
@Stability(Stable) public DataflowJob.Builder additionalExperiments(List<String> additionalExperiments)
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"]. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#additional_experiments DataflowJob#additional_experiments}
Parameters:
additionalExperiments - List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"]. This parameter is required.
Returns:
this
-
enableStreamingEngine
@Stability(Stable) public DataflowJob.Builder enableStreamingEngine(Boolean enableStreamingEngine)
Indicates if the job should use the streaming engine feature. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#enable_streaming_engine DataflowJob#enable_streaming_engine}
Parameters:
enableStreamingEngine - Indicates if the job should use the streaming engine feature. This parameter is required.
Returns:
this
-
enableStreamingEngine
@Stability(Stable) public DataflowJob.Builder enableStreamingEngine(com.hashicorp.cdktf.IResolvable enableStreamingEngine)
Indicates if the job should use the streaming engine feature. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#enable_streaming_engine DataflowJob#enable_streaming_engine}
Parameters:
enableStreamingEngine - Indicates if the job should use the streaming engine feature. This parameter is required.
Returns:
this
-
id
@Stability(Stable) public DataflowJob.Builder id(String id)
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#id DataflowJob#id}. Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
Parameters:
id - Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#id DataflowJob#id}. This parameter is required.
Returns:
this
-
ipConfiguration
@Stability(Stable) public DataflowJob.Builder ipConfiguration(String ipConfiguration)
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE". Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#ip_configuration DataflowJob#ip_configuration}
Parameters:
ipConfiguration - The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE". This parameter is required.
Returns:
this
-
kmsKeyName
@Stability(Stable) public DataflowJob.Builder kmsKeyName(String kmsKeyName)
The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#kms_key_name DataflowJob#kms_key_name}
Parameters:
kmsKeyName - The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY. This parameter is required.
Returns:
this
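As a quick check of the documented key format, the sketch below assembles a key name from its four segments. All segment values (project, location, key ring, key) are hypothetical placeholders.

```java
// Assemble a Cloud KMS key name in the format documented above:
// projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY
public class KmsKeyNameExample {
    static String kmsKeyName(String project, String location, String ring, String key) {
        return String.format("projects/%s/locations/%s/keyRings/%s/cryptoKeys/%s",
                project, location, ring, key);
    }

    public static void main(String[] args) {
        // Hypothetical values; substitute your own project and key resources.
        System.out.println(kmsKeyName("my-project", "us-central1", "my-ring", "my-key"));
    }
}
```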
-
labels
@Stability(Stable) public DataflowJob.Builder labels(Map<String,String> labels)
User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#labels DataflowJob#labels}
Parameters:
labels - User labels to be specified for the job. This parameter is required.
Returns:
this
-
machineType
@Stability(Stable) public DataflowJob.Builder machineType(String machineType)
The machine type to use for the job. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#machine_type DataflowJob#machine_type}
Parameters:
machineType - The machine type to use for the job. This parameter is required.
Returns:
this
-
maxWorkers
@Stability(Stable) public DataflowJob.Builder maxWorkers(Number maxWorkers)
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#max_workers DataflowJob#max_workers}
Parameters:
maxWorkers - The number of workers permitted to work on the job. More workers may improve processing speed at additional cost. This parameter is required.
Returns:
this
-
network
@Stability(Stable) public DataflowJob.Builder network(String network)
The network to which VMs will be assigned. If it is not provided, "default" will be used. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#network DataflowJob#network}
Parameters:
network - The network to which VMs will be assigned. If it is not provided, "default" will be used. This parameter is required.
Returns:
this
-
onDelete
@Stability(Stable) public DataflowJob.Builder onDelete(String onDelete)
One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#on_delete DataflowJob#on_delete}
Parameters:
onDelete - One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy. This parameter is required.
Returns:
this
-
parameters
@Stability(Stable) public DataflowJob.Builder parameters(Map<String,String> parameters)
Key/Value pairs to be passed to the Dataflow job (as used in the template). Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#parameters DataflowJob#parameters}
Parameters:
parameters - Key/Value pairs to be passed to the Dataflow job (as used in the template). This parameter is required.
Returns:
this
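The map passed to parameters(Map) is a plain string-to-string map whose keys must match the template's declared inputs. The sketch below uses parameter names from the classic Word Count template as an illustration; the bucket path is a hypothetical placeholder.

```java
import java.util.Map;

// Build the parameter map for a template-launched Dataflow job.
// Key names must match the inputs declared by the template you run;
// "inputFile" and "output" here are the Word Count template's inputs.
public class DataflowParamsExample {
    static Map<String, String> templateParameters() {
        return Map.of(
                "inputFile", "gs://dataflow-samples/shakespeare/kinglear.txt",
                "output", "gs://my-bucket/wordcount/output"); // hypothetical bucket
    }

    public static void main(String[] args) {
        System.out.println(templateParameters());
    }
}
```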
-
project
@Stability(Stable) public DataflowJob.Builder project(String project)
The project in which the resource belongs. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#project DataflowJob#project}
Parameters:
project - The project in which the resource belongs. This parameter is required.
Returns:
this
-
region
@Stability(Stable) public DataflowJob.Builder region(String region)
The region in which the created job should run. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#region DataflowJob#region}
Parameters:
region - The region in which the created job should run. This parameter is required.
Returns:
this
-
serviceAccountEmail
@Stability(Stable) public DataflowJob.Builder serviceAccountEmail(String serviceAccountEmail)
The Service Account email used to create the job. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#service_account_email DataflowJob#service_account_email}
Parameters:
serviceAccountEmail - The Service Account email used to create the job. This parameter is required.
Returns:
this
-
skipWaitOnJobTermination
@Stability(Stable) public DataflowJob.Builder skipWaitOnJobTermination(Boolean skipWaitOnJobTermination)
If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#skip_wait_on_job_termination DataflowJob#skip_wait_on_job_termination}
Parameters:
skipWaitOnJobTermination - If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on. This parameter is required.
Returns:
this
-
skipWaitOnJobTermination
@Stability(Stable) public DataflowJob.Builder skipWaitOnJobTermination(com.hashicorp.cdktf.IResolvable skipWaitOnJobTermination)
If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#skip_wait_on_job_termination DataflowJob#skip_wait_on_job_termination}
Parameters:
skipWaitOnJobTermination - If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on. This parameter is required.
Returns:
this
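The warning above suggests making each job name unique, e.g. by embedding a release ID or a random suffix. One way to do that in plain Java is sketched below; the base name and suffix scheme are illustrative, not something this builder requires.

```java
import java.util.UUID;

// Derive a unique Dataflow job name by appending a suffix to a base name,
// so a replacement job never collides with a draining or cancelling one.
public class UniqueJobNameExample {
    static String uniqueJobName(String base, String suffix) {
        return base + "-" + suffix;
    }

    public static void main(String[] args) {
        // A short random suffix; a release ID or build number works equally well.
        String suffix = UUID.randomUUID().toString().substring(0, 8);
        System.out.println(uniqueJobName("my-dataflow-job", suffix)); // hypothetical base name
    }
}
```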
-
subnetwork
@Stability(Stable) public DataflowJob.Builder subnetwork(String subnetwork)
The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#subnetwork DataflowJob#subnetwork}
Parameters:
subnetwork - The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". This parameter is required.
Returns:
this
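To make the documented form concrete, the sketch below assembles a subnetwork path from a region and subnetwork name; both values are hypothetical.

```java
// Assemble a subnetwork value in the documented form:
// regions/REGION/subnetworks/SUBNETWORK
public class SubnetworkPathExample {
    static String subnetworkPath(String region, String subnetwork) {
        return String.format("regions/%s/subnetworks/%s", region, subnetwork);
    }

    public static void main(String[] args) {
        // Hypothetical region and subnetwork names.
        System.out.println(subnetworkPath("us-central1", "my-subnet"));
    }
}
```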
-
timeouts
@Stability(Stable) public DataflowJob.Builder timeouts(DataflowJobTimeouts timeouts)
timeouts block. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#timeouts DataflowJob#timeouts}
Parameters:
timeouts - timeouts block. This parameter is required.
Returns:
this
-
transformNameMapping
@Stability(Stable) public DataflowJob.Builder transformNameMapping(Map<String,String> transformNameMapping)
Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#transform_name_mapping DataflowJob#transform_name_mapping}
Parameters:
transformNameMapping - Only applicable when updating a pipeline. This parameter is required.
Returns:
this
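The mapping is a string-to-string map from transform-name prefixes in the currently running job to the corresponding prefixes in its replacement. The transform names in this sketch are hypothetical.

```java
import java.util.Map;

// Build a transform-name mapping for a pipeline update: keys are prefixes
// in the running job, values are the matching prefixes in the new job.
public class TransformMappingExample {
    static Map<String, String> transformNameMapping() {
        return Map.of("oldTransform", "newTransform"); // hypothetical transform names
    }

    public static void main(String[] args) {
        System.out.println(transformNameMapping());
    }
}
```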
-
zone
@Stability(Stable) public DataflowJob.Builder zone(String zone)
The zone in which the created job should run. If it is not provided, the provider zone is used. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#zone DataflowJob#zone}
Parameters:
zone - The zone in which the created job should run. If it is not provided, the provider zone is used. This parameter is required.
Returns:
this
-
build
@Stability(Stable) public DataflowJob build()
Specified by:
build in interface software.amazon.jsii.Builder<DataflowJob>
Returns:
a newly built instance of DataflowJob.
-
-