Interface DataflowJobConfig
-
- All Superinterfaces:
software.amazon.jsii.JsiiSerializable, com.hashicorp.cdktf.TerraformMetaArguments
- All Known Implementing Classes:
DataflowJobConfig.Jsii$Proxy
@Generated(value="jsii-pacmak/1.102.0 (build e354887)", date="2024-08-31T03:59:20.564Z") @Stability(Stable) public interface DataflowJobConfig extends software.amazon.jsii.JsiiSerializable, com.hashicorp.cdktf.TerraformMetaArguments
-
-
Nested Class Summary
- static class DataflowJobConfig.Builder - A builder for DataflowJobConfig
- static class DataflowJobConfig.Jsii$Proxy - An implementation for DataflowJobConfig
-
Method Summary
- static DataflowJobConfig.Builder builder()
- default List<String> getAdditionalExperiments() - List of experiments that should be used by the job.
- default Object getEnableStreamingEngine() - Indicates if the job should use the streaming engine feature.
- default String getId() - Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#id DataflowJob#id}.
- default String getIpConfiguration() - The configuration for VM IPs.
- default String getKmsKeyName() - The name for the Cloud KMS key for the job.
- default Map<String,String> getLabels() - User labels to be specified for the job.
- default String getMachineType() - The machine type to use for the job.
- default Number getMaxWorkers() - The number of workers permitted to work on the job.
- String getName() - A unique name for the resource, required by Dataflow.
- default String getNetwork() - The network to which VMs will be assigned.
- default String getOnDelete() - One of "drain" or "cancel".
- default Map<String,String> getParameters() - Key/Value pairs to be passed to the Dataflow job (as used in the template).
- default String getProject() - The project in which the resource belongs.
- default String getRegion() - The region in which the created job should run.
- default String getServiceAccountEmail() - The Service Account email used to create the job.
- default Object getSkipWaitOnJobTermination() - If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
- default String getSubnetwork() - The subnetwork to which VMs will be assigned.
- String getTempGcsLocation() - A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.
- String getTemplateGcsPath() - The Google Cloud Storage path to the Dataflow job template.
- default DataflowJobTimeouts getTimeouts() - timeouts block.
- default Map<String,String> getTransformNameMapping() - Only applicable when updating a pipeline.
- default String getZone() - The zone in which the created job should run.
-
-
-
Method Detail
-
getName
@Stability(Stable) @NotNull String getName()
A unique name for the resource, required by Dataflow.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#name DataflowJob#name}
-
getTempGcsLocation
@Stability(Stable) @NotNull String getTempGcsLocation()
A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#temp_gcs_location DataflowJob#temp_gcs_location}
-
getTemplateGcsPath
@Stability(Stable) @NotNull String getTemplateGcsPath()
The Google Cloud Storage path to the Dataflow job template.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#template_gcs_path DataflowJob#template_gcs_path}
-
getAdditionalExperiments
@Stability(Stable) @Nullable default List<String> getAdditionalExperiments()
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#additional_experiments DataflowJob#additional_experiments}
-
getEnableStreamingEngine
@Stability(Stable) @Nullable default Object getEnableStreamingEngine()
Indicates if the job should use the streaming engine feature.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#enable_streaming_engine DataflowJob#enable_streaming_engine}
-
getId
@Stability(Stable) @Nullable default String getId()
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#id DataflowJob#id}.
Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value, the field may not be settable; check the provider documentation to confirm.
-
getIpConfiguration
@Stability(Stable) @Nullable default String getIpConfiguration()
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#ip_configuration DataflowJob#ip_configuration}
-
getKmsKeyName
@Stability(Stable) @Nullable default String getKmsKeyName()
The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#kms_key_name DataflowJob#kms_key_name}
-
getLabels
@Stability(Stable) @Nullable default Map<String,String> getLabels()
User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#labels DataflowJob#labels}
-
getMachineType
@Stability(Stable) @Nullable default String getMachineType()
The machine type to use for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#machine_type DataflowJob#machine_type}
-
getMaxWorkers
@Stability(Stable) @Nullable default Number getMaxWorkers()
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#max_workers DataflowJob#max_workers}
-
getNetwork
@Stability(Stable) @Nullable default String getNetwork()
The network to which VMs will be assigned. If it is not provided, "default" will be used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#network DataflowJob#network}
-
getOnDelete
@Stability(Stable) @Nullable default String getOnDelete()
One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#on_delete DataflowJob#on_delete}
-
getParameters
@Stability(Stable) @Nullable default Map<String,String> getParameters()
Key/Value pairs to be passed to the Dataflow job (as used in the template).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#parameters DataflowJob#parameters}
-
getProject
@Stability(Stable) @Nullable default String getProject()
The project in which the resource belongs.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#project DataflowJob#project}
-
getRegion
@Stability(Stable) @Nullable default String getRegion()
The region in which the created job should run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#region DataflowJob#region}
-
getServiceAccountEmail
@Stability(Stable) @Nullable default String getServiceAccountEmail()
The Service Account email used to create the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#service_account_email DataflowJob#service_account_email}
-
getSkipWaitOnJobTermination
@Stability(Stable) @Nullable default Object getSkipWaitOnJobTermination()
If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#skip_wait_on_job_termination DataflowJob#skip_wait_on_job_termination}
-
getSubnetwork
@Stability(Stable) @Nullable default String getSubnetwork()
The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#subnetwork DataflowJob#subnetwork}
-
getTimeouts
@Stability(Stable) @Nullable default DataflowJobTimeouts getTimeouts()
timeouts block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#timeouts DataflowJob#timeouts}
-
getTransformNameMapping
@Stability(Stable) @Nullable default Map<String,String> getTransformNameMapping()
Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#transform_name_mapping DataflowJob#transform_name_mapping}
-
getZone
@Stability(Stable) @Nullable default String getZone()
The zone in which the created job should run. If it is not provided, the provider zone is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataflow_job#zone DataflowJob#zone}
-
builder
@Stability(Stable) static DataflowJobConfig.Builder builder()
- Returns:
- a DataflowJobConfig.Builder of DataflowJobConfig
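In practice this interface is instantiated through its builder rather than implemented directly. A minimal configuration sketch is shown below; it assumes the cdktf Google provider bindings are on the classpath, and the bucket, template path, and parameter values are hypothetical placeholders, not defaults:

```java
import java.util.Map;

// Sketch only: the three required attributes (name, tempGcsLocation,
// templateGcsPath) plus a few common optional ones.
DataflowJobConfig config = DataflowJobConfig.builder()
        .name("wordcount-v42")                 // required; must be unique across jobs
        .tempGcsLocation("gs://my-bucket/tmp") // required; writeable GCS location
        .templateGcsPath("gs://my-bucket/templates/wordcount") // required
        .parameters(Map.of(                    // passed through to the template
                "inputFile", "gs://my-bucket/input.txt",
                "output", "gs://my-bucket/output"))
        .maxWorkers(5)
        .onDelete("drain")                     // drain rather than cancel on destroy
        .build();
```

Embedding a version or release identifier in the name, as in this sketch, is one way to avoid the job name conflicts warned about under getSkipWaitOnJobTermination.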
-
-