Class DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder

java.lang.Object
  com.hashicorp.cdktf.providers.google.data_pipeline_pipeline.DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder

All Implemented Interfaces:
software.amazon.jsii.Builder<DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment>

Enclosing interface:
DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment

@Stability(Stable) public static final class DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder extends Object implements software.amazon.jsii.Builder<DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment>
Constructor Summary

Constructor    Description
Builder()

Method Summary
Method Detail
-
additionalExperiments
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder additionalExperiments(List<String> additionalExperiments)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getAdditionalExperiments().
Parameters:
additionalExperiments - Additional experiment flags for the job. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#additional_experiments DataPipelinePipeline#additional_experiments}
Returns:
this
-
additionalUserLabels
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder additionalUserLabels(Map<String,String> additionalUserLabels)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getAdditionalUserLabels().
Parameters:
additionalUserLabels - Additional user labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#additional_user_labels DataPipelinePipeline#additional_user_labels}
Returns:
this
-
bypassTempDirValidation
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder bypassTempDirValidation(Boolean bypassTempDirValidation)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getBypassTempDirValidation().
Parameters:
bypassTempDirValidation - Whether to bypass the safety checks for the job's temporary directory. Use with caution. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#bypass_temp_dir_validation DataPipelinePipeline#bypass_temp_dir_validation}
Returns:
this
-
bypassTempDirValidation
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder bypassTempDirValidation(com.hashicorp.cdktf.IResolvable bypassTempDirValidation)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getBypassTempDirValidation().
Parameters:
bypassTempDirValidation - Whether to bypass the safety checks for the job's temporary directory. Use with caution. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#bypass_temp_dir_validation DataPipelinePipeline#bypass_temp_dir_validation}
Returns:
this
-
enableStreamingEngine
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder enableStreamingEngine(Boolean enableStreamingEngine)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getEnableStreamingEngine().
Parameters:
enableStreamingEngine - Whether to enable Streaming Engine for the job. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#enable_streaming_engine DataPipelinePipeline#enable_streaming_engine}
Returns:
this
-
enableStreamingEngine
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder enableStreamingEngine(com.hashicorp.cdktf.IResolvable enableStreamingEngine)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getEnableStreamingEngine().
Parameters:
enableStreamingEngine - Whether to enable Streaming Engine for the job. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#enable_streaming_engine DataPipelinePipeline#enable_streaming_engine}
Returns:
this
-
ipConfiguration
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder ipConfiguration(String ipConfiguration)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getIpConfiguration().
Parameters:
ipConfiguration - Configuration for VM IPs (https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines#WorkerIPAddressConfiguration). Possible values: ["WORKER_IP_UNSPECIFIED", "WORKER_IP_PUBLIC", "WORKER_IP_PRIVATE"]. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#ip_configuration DataPipelinePipeline#ip_configuration}
Returns:
this
-
kmsKeyName
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder kmsKeyName(String kmsKeyName)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getKmsKeyName().
Parameters:
kmsKeyName - Name for the Cloud KMS key for the job. The key format is: projects//locations//keyRings//cryptoKeys/. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#kms_key_name DataPipelinePipeline#kms_key_name}
Returns:
this
-
machineType
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder machineType(String machineType)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getMachineType().
Parameters:
machineType - The machine type to use for the job. Defaults to the value from the template if not specified. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#machine_type DataPipelinePipeline#machine_type}
Returns:
this
-
maxWorkers
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder maxWorkers(Number maxWorkers)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getMaxWorkers().
Parameters:
maxWorkers - The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#max_workers DataPipelinePipeline#max_workers}
Returns:
this
-
network
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder network(String network)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getNetwork().
Parameters:
network - Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default". Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#network DataPipelinePipeline#network}
Returns:
this
-
numWorkers
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder numWorkers(Number numWorkers)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getNumWorkers().
Parameters:
numWorkers - The initial number of Compute Engine instances for the job. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#num_workers DataPipelinePipeline#num_workers}
Returns:
this
-
serviceAccountEmail
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder serviceAccountEmail(String serviceAccountEmail)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getServiceAccountEmail().
Parameters:
serviceAccountEmail - The email address of the service account to run the job as. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#service_account_email DataPipelinePipeline#service_account_email}
Returns:
this
-
subnetwork
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder subnetwork(String subnetwork)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getSubnetwork().
Parameters:
subnetwork - Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#subnetwork DataPipelinePipeline#subnetwork}
Returns:
this
-
tempLocation
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder tempLocation(String tempLocation)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getTempLocation().
Parameters:
tempLocation - The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#temp_location DataPipelinePipeline#temp_location}
Returns:
this
-
workerRegion
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder workerRegion(String workerRegion)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getWorkerRegion().
Parameters:
workerRegion - The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with workerZone. If neither workerRegion nor workerZone is specified, defaults to the control plane's region. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#worker_region DataPipelinePipeline#worker_region}
Returns:
this
-
workerZone
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder workerZone(String workerZone)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getWorkerZone().
Parameters:
workerZone - The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with workerRegion. If neither workerRegion nor workerZone is specified, a zone in the control plane's region is chosen based on available capacity. If both workerZone and zone are set, workerZone takes precedence. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#worker_zone DataPipelinePipeline#worker_zone}
Returns:
this
-
zone
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder zone(String zone)
Sets the value of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.getZone().
Parameters:
zone - The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, workerZone will take precedence. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#zone DataPipelinePipeline#zone}
Returns:
this
-
build
@Stability(Stable) public DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment build()
Builds the configured instance.
Specified by:
build in interface software.amazon.jsii.Builder<DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment>
Returns:
a new instance of DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
Throws:
NullPointerException - if any required attribute was not provided
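As a sketch of how this builder is typically used (every value below — the experiment flag, label, machine type, bucket path, and region — is an illustrative placeholder, not a value taken from this reference; the setters shown are the ones documented above):

```java
import com.hashicorp.cdktf.providers.google.data_pipeline_pipeline.DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment;

import java.util.List;
import java.util.Map;

public class EnvironmentExample {
    public static void main(String[] args) {
        // Each setter returns the builder, so calls can be chained;
        // build() produces the immutable configuration struct.
        DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment environment =
            new DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder()
                .additionalExperiments(List.of("enable_stackdriver_agent_metrics")) // placeholder flag
                .additionalUserLabels(Map.of("team", "data-eng"))                   // placeholder label
                .enableStreamingEngine(true)          // Boolean overload; an IResolvable overload also exists
                .ipConfiguration("WORKER_IP_PRIVATE") // one of the documented possible values
                .machineType("n1-standard-2")         // placeholder machine type
                .maxWorkers(10)
                .tempLocation("gs://my-bucket/tmp")   // placeholder bucket; must begin with gs://
                .workerRegion("us-west1")
                .build();
    }
}
```

The resulting object is then supplied to the enclosing launch-parameters configuration of the DataPipelinePipeline resource; how it is wired in depends on how the rest of that resource is assembled.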