Class DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Jsii$Proxy
- java.lang.Object
  - software.amazon.jsii.JsiiObject
    - com.hashicorp.cdktf.providers.google.data_pipeline_pipeline.DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Jsii$Proxy
- All Implemented Interfaces:
  DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment, software.amazon.jsii.JsiiSerializable
- Enclosing interface:
  DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
@Stability(Stable) @Internal public static final class DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Jsii$Proxy extends software.amazon.jsii.JsiiObject implements DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
An implementation for DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.
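In practice this proxy is not constructed directly; instances are obtained by populating the struct through its nested Builder. A minimal sketch, assuming the CDKTF Google provider is on the classpath (the bucket name, machine type, and worker counts below are hypothetical placeholders):

```java
import com.hashicorp.cdktf.providers.google.data_pipeline_pipeline.DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment;

// Sketch: build the environment struct via the generated Builder.
// All property values are hypothetical placeholders.
DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment env =
    DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.builder()
        .tempLocation("gs://my-bucket/tmp")  // must be a gs:// URL
        .machineType("n1-standard-2")
        .numWorkers(2)
        .maxWorkers(10)                      // allowed range is 1 to 1000
        .build();
```

The object returned by `build()` is a `Jsii$Proxy` whose getters (documented below) return the values set on the Builder.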
-
-
Nested Class Summary
-
Nested classes/interfaces inherited from class software.amazon.jsii.JsiiObject
software.amazon.jsii.JsiiObject.InitializationMode
-
Nested classes/interfaces inherited from interface com.hashicorp.cdktf.providers.google.data_pipeline_pipeline.DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder, DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Jsii$Proxy
-
-
Constructor Summary
Constructors:
- protected Jsii$Proxy(DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder builder)
  Constructor that initializes the object based on literal property values passed by the DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder.
- protected Jsii$Proxy(software.amazon.jsii.JsiiObjectRef objRef)
  Constructor that initializes the object based on values retrieved from the JsiiObject.
-
Method Summary
All Methods | Instance Methods | Concrete Methods
- com.fasterxml.jackson.databind.JsonNode $jsii$toJson()
- boolean equals(Object o)
- List<String> getAdditionalExperiments(): Additional experiment flags for the job.
- Map<String,String> getAdditionalUserLabels(): Additional user labels to be specified for the job.
- Object getBypassTempDirValidation(): Whether to bypass the safety checks for the job's temporary directory.
- Object getEnableStreamingEngine(): Whether to enable Streaming Engine for the job.
- String getIpConfiguration(): Configuration for VM IPs.
- String getKmsKeyName(): Name for the Cloud KMS key for the job.
- String getMachineType(): The machine type to use for the job.
- Number getMaxWorkers(): The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
- String getNetwork(): Network to which VMs will be assigned.
- Number getNumWorkers(): The initial number of Compute Engine instances for the job.
- String getServiceAccountEmail(): The email address of the service account to run the job as.
- String getSubnetwork(): Subnetwork to which VMs will be assigned, if desired.
- String getTempLocation(): The Cloud Storage path to use for temporary files.
- String getWorkerRegion(): The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1".
- String getWorkerZone(): The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a".
- String getZone(): The Compute Engine availability zone for launching worker instances to run your pipeline.
- int hashCode()
-
-
-
Constructor Detail
-
Jsii$Proxy
protected Jsii$Proxy(software.amazon.jsii.JsiiObjectRef objRef)
Constructor that initializes the object based on values retrieved from the JsiiObject.
- Parameters:
  objRef - Reference to the JSII managed object.
-
Jsii$Proxy
protected Jsii$Proxy(DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder builder)
Constructor that initializes the object based on literal property values passed by the DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.Builder.
-
-
Method Detail
-
getAdditionalExperiments
public final List<String> getAdditionalExperiments()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
Additional experiment flags for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#additional_experiments DataPipelinePipeline#additional_experiments}
- Specified by:
  getAdditionalExperiments in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getAdditionalUserLabels
public final Map<String,String> getAdditionalUserLabels()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
Additional user labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#additional_user_labels DataPipelinePipeline#additional_user_labels}
- Specified by:
  getAdditionalUserLabels in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
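The label map above can be built with plain Java collections; a minimal sketch using the example values from the description (the map would then be passed to the struct's Builder):

```java
import java.util.Map;

// Hypothetical label map mirroring the example above; keys and values
// must follow the Google Cloud labeling restrictions.
Map<String, String> labels = Map.of(
    "name", "wrench",
    "mass", "1.3kg",
    "count", "3");
```

On the Builder this would typically be supplied via `.additionalUserLabels(labels)` (the fluent setter name is assumed from the jsii Builder convention of matching the property name).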
-
getBypassTempDirValidation
public final Object getBypassTempDirValidation()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
Whether to bypass the safety checks for the job's temporary directory. Use with caution.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#bypass_temp_dir_validation DataPipelinePipeline#bypass_temp_dir_validation}
- Specified by:
  getBypassTempDirValidation in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getEnableStreamingEngine
public final Object getEnableStreamingEngine()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
Whether to enable Streaming Engine for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#enable_streaming_engine DataPipelinePipeline#enable_streaming_engine}
- Specified by:
  getEnableStreamingEngine in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getIpConfiguration
public final String getIpConfiguration()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
Configuration for VM IPs. https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines#WorkerIPAddressConfiguration Possible values: ["WORKER_IP_UNSPECIFIED", "WORKER_IP_PUBLIC", "WORKER_IP_PRIVATE"].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#ip_configuration DataPipelinePipeline#ip_configuration}
- Specified by:
  getIpConfiguration in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getKmsKeyName
public final String getKmsKeyName()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
Name for the Cloud KMS key for the job. The key format is: projects//locations//keyRings//cryptoKeys/.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#kms_key_name DataPipelinePipeline#kms_key_name}
- Specified by:
  getKmsKeyName in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getMachineType
public final String getMachineType()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
The machine type to use for the job. Defaults to the value from the template if not specified.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#machine_type DataPipelinePipeline#machine_type}
- Specified by:
  getMachineType in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getMaxWorkers
public final Number getMaxWorkers()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#max_workers DataPipelinePipeline#max_workers}
- Specified by:
  getMaxWorkers in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getNetwork
public final String getNetwork()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#network DataPipelinePipeline#network}
- Specified by:
  getNetwork in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getNumWorkers
public final Number getNumWorkers()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
The initial number of Compute Engine instances for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#num_workers DataPipelinePipeline#num_workers}
- Specified by:
  getNumWorkers in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getServiceAccountEmail
public final String getServiceAccountEmail()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
The email address of the service account to run the job as.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#service_account_email DataPipelinePipeline#service_account_email}
- Specified by:
  getServiceAccountEmail in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getSubnetwork
public final String getSubnetwork()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#subnetwork DataPipelinePipeline#subnetwork}
- Specified by:
  getSubnetwork in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getTempLocation
public final String getTempLocation()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#temp_location DataPipelinePipeline#temp_location}
- Specified by:
  getTempLocation in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getWorkerRegion
public final String getWorkerRegion()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with workerZone. If neither workerRegion nor workerZone is specified, the default is the control plane's region.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#worker_region DataPipelinePipeline#worker_region}
- Specified by:
  getWorkerRegion in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getWorkerZone
public final String getWorkerZone()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with workerRegion. If neither workerRegion nor workerZone is specified, a zone in the control plane's region is chosen based on available capacity. If both workerZone and zone are set, workerZone takes precedence.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#worker_zone DataPipelinePipeline#worker_zone}
- Specified by:
  getWorkerZone in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
getZone
public final String getZone()
Description copied from interface: DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, workerZone will take precedence.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/data_pipeline_pipeline#zone DataPipelinePipeline#zone}
- Specified by:
  getZone in interface DataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
-
$jsii$toJson
@Internal public com.fasterxml.jackson.databind.JsonNode $jsii$toJson()
- Specified by:
  $jsii$toJson in interface software.amazon.jsii.JsiiSerializable
-
-