Interface DataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment
- All Superinterfaces:
software.amazon.jsii.JsiiSerializable
- All Known Implementing Classes:
DataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment.Jsii$Proxy
@Generated(value="jsii-pacmak/1.102.0 (build e354887)", date="2024-08-31T03:59:20.535Z") @Stability(Stable) public interface DataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment extends software.amazon.jsii.JsiiSerializable
Nested Class Summary
static class DataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment.Builder
static class DataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment.Jsii$Proxy
    An implementation for DataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment.
Method Summary
static DataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment.Builder builder()
default List<String> getAdditionalExperiments()
    Additional experiment flags for the job.
default Map<String,String> getAdditionalUserLabels()
    Additional user labels to be specified for the job.
default Object getEnableStreamingEngine()
    Whether to enable Streaming Engine for the job.
default String getFlexrsGoal()
    Set FlexRS goal for the job.
default String getIpConfiguration()
    Configuration for VM IPs.
default String getKmsKeyName()
    Name for the Cloud KMS key for the job.
default String getMachineType()
    The machine type to use for the job.
default Number getMaxWorkers()
    The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
default String getNetwork()
    Network to which VMs will be assigned.
default Number getNumWorkers()
    The initial number of Compute Engine instances for the job.
default String getServiceAccountEmail()
    The email address of the service account to run the job as.
default String getSubnetwork()
    Subnetwork to which VMs will be assigned, if desired.
default String getTempLocation()
    The Cloud Storage path to use for temporary files.
default String getWorkerRegion()
    The Compute Engine region in which worker processing should occur.
default String getWorkerZone()
    The Compute Engine zone in which worker processing should occur.
default String getZone()
    The Compute Engine availability zone for launching worker instances to run your pipeline.
Method Detail
-
getAdditionalExperiments
@Stability(Stable) @Nullable default List<String> getAdditionalExperiments()
Additional experiment flags for the job. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#additional_experiments DataPipelinePipeline#additional_experiments}
-
getAdditionalUserLabels
@Stability(Stable) @Nullable default Map<String,String> getAdditionalUserLabels()
Additional user labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#additional_user_labels DataPipelinePipeline#additional_user_labels}
-
getEnableStreamingEngine
@Stability(Stable) @Nullable default Object getEnableStreamingEngine()
Whether to enable Streaming Engine for the job. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#enable_streaming_engine DataPipelinePipeline#enable_streaming_engine}
-
getFlexrsGoal
@Stability(Stable) @Nullable default String getFlexrsGoal()
Set FlexRS goal for the job. https://cloud.google.com/dataflow/docs/guides/flexrs https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines#FlexResourceSchedulingGoal Possible values: ["FLEXRS_UNSPECIFIED", "FLEXRS_SPEED_OPTIMIZED", "FLEXRS_COST_OPTIMIZED"]. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#flexrs_goal DataPipelinePipeline#flexrs_goal}
-
getIpConfiguration
@Stability(Stable) @Nullable default String getIpConfiguration()
Configuration for VM IPs. https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines#WorkerIPAddressConfiguration Possible values: ["WORKER_IP_UNSPECIFIED", "WORKER_IP_PUBLIC", "WORKER_IP_PRIVATE"]. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#ip_configuration DataPipelinePipeline#ip_configuration}
-
getKmsKeyName
@Stability(Stable) @Nullable default String getKmsKeyName()
Name for the Cloud KMS key for the job. The key format is: projects//locations//keyRings//cryptoKeys/. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#kms_key_name DataPipelinePipeline#kms_key_name}
-
getMachineType
@Stability(Stable) @Nullable default String getMachineType()
The machine type to use for the job. Defaults to the value from the template if not specified. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#machine_type DataPipelinePipeline#machine_type}
-
getMaxWorkers
@Stability(Stable) @Nullable default Number getMaxWorkers()
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#max_workers DataPipelinePipeline#max_workers}
-
getNetwork
@Stability(Stable) @Nullable default String getNetwork()
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default". Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#network DataPipelinePipeline#network}
-
getNumWorkers
@Stability(Stable) @Nullable default Number getNumWorkers()
The initial number of Compute Engine instances for the job. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#num_workers DataPipelinePipeline#num_workers}
-
getServiceAccountEmail
@Stability(Stable) @Nullable default String getServiceAccountEmail()
The email address of the service account to run the job as. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#service_account_email DataPipelinePipeline#service_account_email}
-
getSubnetwork
@Stability(Stable) @Nullable default String getSubnetwork()
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#subnetwork DataPipelinePipeline#subnetwork}
-
getTempLocation
@Stability(Stable) @Nullable default String getTempLocation()
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#temp_location DataPipelinePipeline#temp_location}
-
getWorkerRegion
@Stability(Stable) @Nullable default String getWorkerRegion()
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with workerZone. If neither workerRegion nor workerZone is specified, default to the control plane's region. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#worker_region DataPipelinePipeline#worker_region}
-
getWorkerZone
@Stability(Stable) @Nullable default String getWorkerZone()
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with workerRegion. If neither workerRegion nor workerZone is specified, a zone in the control plane's region is chosen based on available capacity. If both workerZone and zone are set, workerZone takes precedence. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#worker_zone DataPipelinePipeline#worker_zone}
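Because workerRegion and workerZone are mutually exclusive, it can be worth failing fast on a misconfigured pair before synthesizing the Terraform configuration. A minimal sketch of such a check; WorkerPlacement and describe are hypothetical helpers, not part of the generated bindings:

```java
// Hypothetical helper (not part of the generated API): enforces the
// mutual-exclusivity rule between workerRegion and workerZone documented
// above, and reports which placement setting will take effect.
class WorkerPlacement {
    static String describe(String workerRegion, String workerZone) {
        if (workerRegion != null && workerZone != null) {
            // The provider documents these two fields as mutually exclusive.
            throw new IllegalArgumentException(
                "workerRegion and workerZone are mutually exclusive");
        }
        if (workerZone != null) {
            return "zone:" + workerZone;
        }
        if (workerRegion != null) {
            return "region:" + workerRegion;
        }
        // Neither set: the service falls back to the control plane's region.
        return "default";
    }

    public static void main(String[] args) {
        System.out.println(describe("us-west1", null)); // region:us-west1
        System.out.println(describe(null, null));       // default
    }
}
```

Running such a check at construction time surfaces the conflict immediately rather than at `terraform apply`.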
-
getZone
@Stability(Stable) @Nullable default String getZone()
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, workerZone will take precedence. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#zone DataPipelinePipeline#zone}
-
builder
@Stability(Stable) static DataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment.Builder builder()
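The generated Builder follows the usual jsii pattern: one fluent setter per getter above, with build() returning an immutable instance backed by the Jsii$Proxy class. The real class requires the cdktf-provider-google Java bindings on the classpath; the following is a self-contained sketch of the same pattern using local stand-in types (the interface, its fields, and the example values are illustrative, not the generated code):

```java
import java.util.List;

// Stand-in mirroring the shape jsii-pacmak emits: an interface with
// nullable default getters plus a nested fluent Builder. This is a
// sketch of the pattern, not the generated class itself.
interface FlexTemplateEnvironmentSketch {
    default List<String> getAdditionalExperiments() { return null; }
    default Number getMaxWorkers() { return null; }
    default String getTempLocation() { return null; }

    final class Builder {
        private List<String> additionalExperiments;
        private Number maxWorkers;
        private String tempLocation;

        public Builder additionalExperiments(List<String> v) { this.additionalExperiments = v; return this; }
        public Builder maxWorkers(Number v) { this.maxWorkers = v; return this; }
        public Builder tempLocation(String v) { this.tempLocation = v; return this; }

        public FlexTemplateEnvironmentSketch build() {
            final List<String> exp = additionalExperiments;
            final Number max = maxWorkers;
            final String temp = tempLocation;
            // Anonymous class stands in for the generated Jsii$Proxy.
            return new FlexTemplateEnvironmentSketch() {
                @Override public List<String> getAdditionalExperiments() { return exp; }
                @Override public Number getMaxWorkers() { return max; }
                @Override public String getTempLocation() { return temp; }
            };
        }
    }
}

class FlexTemplateEnvironmentSketchDemo {
    public static void main(String[] args) {
        // Values here are placeholders, not recommendations.
        FlexTemplateEnvironmentSketch env = new FlexTemplateEnvironmentSketch.Builder()
                .additionalExperiments(List.of("example_experiment_flag"))
                .maxWorkers(50)
                .tempLocation("gs://example-bucket/tmp")
                .build();
        System.out.println(env.getMaxWorkers());   // 50
        System.out.println(env.getTempLocation()); // gs://example-bucket/tmp
    }
}
```

Unset properties stay null, matching the @Nullable default getters of the generated interface; the provider then falls back to the template or service defaults for those fields.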