Interface DataprocJobConfig

All Superinterfaces:
software.amazon.jsii.JsiiSerializable, com.hashicorp.cdktf.TerraformMetaArguments

All Known Implementing Classes:
DataprocJobConfig.Jsii$Proxy

@Generated(value="jsii-pacmak/1.102.0 (build e354887)", date="2024-08-31T03:59:20.736Z")
@Stability(Stable)
public interface DataprocJobConfig extends software.amazon.jsii.JsiiSerializable, com.hashicorp.cdktf.TerraformMetaArguments
Nested Class Summary
static class DataprocJobConfig.Builder
    A builder for DataprocJobConfig.
static class DataprocJobConfig.Jsii$Proxy
    An implementation for DataprocJobConfig.
Method Summary
static DataprocJobConfig.Builder builder()
default Object getForceDelete()
    By default, you can only delete inactive jobs within Dataproc.
default DataprocJobHadoopConfig getHadoopConfig()
    hadoop_config block.
default DataprocJobHiveConfig getHiveConfig()
    hive_config block.
default String getId()
    Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#id DataprocJob#id}.
default Map<String,String> getLabels()
    Optional.
default DataprocJobPigConfig getPigConfig()
    pig_config block.
DataprocJobPlacement getPlacement()
    placement block.
default DataprocJobPrestoConfig getPrestoConfig()
    presto_config block.
default String getProject()
    The project in which the cluster can be found and jobs subsequently run against.
default DataprocJobPysparkConfig getPysparkConfig()
    pyspark_config block.
default DataprocJobReference getReference()
    reference block.
default String getRegion()
    The Cloud Dataproc region.
default DataprocJobScheduling getScheduling()
    scheduling block.
default DataprocJobSparkConfig getSparkConfig()
    spark_config block.
default DataprocJobSparksqlConfig getSparksqlConfig()
    sparksql_config block.
default DataprocJobTimeouts getTimeouts()
    timeouts block.
Method Detail
-
getPlacement
@Stability(Stable) @NotNull DataprocJobPlacement getPlacement()
placement block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#placement DataprocJob#placement}
-
getForceDelete
@Stability(Stable) @Nullable default Object getForceDelete()
By default, you can only delete inactive jobs within Dataproc. Setting this to true and calling destroy ensures that the job is first cancelled before the delete is issued.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#force_delete DataprocJob#force_delete}
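Because this property is typed as Object (jsii accepts either a plain Boolean or a resolvable token here), a literal true can be passed through the generated builder. A minimal sketch, assuming the standard jsii builder setters that mirror the getters above; the placement value is defined elsewhere:

```java
// Sketch only: requires the cdktf and provider-google Java bindings on the
// classpath; "placement" is a DataprocJobPlacement built elsewhere.
DataprocJobConfig config = DataprocJobConfig.builder()
        .placement(placement)       // required placement block
        .forceDelete(true)          // cancel an active job before destroy deletes it
        .build();
```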
-
getHadoopConfig
@Stability(Stable) @Nullable default DataprocJobHadoopConfig getHadoopConfig()
hadoop_config block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#hadoop_config DataprocJob#hadoop_config}
-
getHiveConfig
@Stability(Stable) @Nullable default DataprocJobHiveConfig getHiveConfig()
hive_config block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#hive_config DataprocJob#hive_config}
-
getId
@Stability(Stable) @Nullable default String getId()
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#id DataprocJob#id}.

Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
-
getLabels
@Stability(Stable) @Nullable default Map<String,String> getLabels()
Optional. The labels to associate with this job.

**Note**: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#labels DataprocJob#labels}
-
getPigConfig
@Stability(Stable) @Nullable default DataprocJobPigConfig getPigConfig()
pig_config block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#pig_config DataprocJob#pig_config}
-
getPrestoConfig
@Stability(Stable) @Nullable default DataprocJobPrestoConfig getPrestoConfig()
presto_config block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#presto_config DataprocJob#presto_config}
-
getProject
@Stability(Stable) @Nullable default String getProject()
The project in which the cluster can be found and jobs subsequently run against. If it is not provided, the provider project is used.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#project DataprocJob#project}
-
getPysparkConfig
@Stability(Stable) @Nullable default DataprocJobPysparkConfig getPysparkConfig()
pyspark_config block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#pyspark_config DataprocJob#pyspark_config}
-
getReference
@Stability(Stable) @Nullable default DataprocJobReference getReference()
reference block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#reference DataprocJob#reference}
-
getRegion
@Stability(Stable) @Nullable default String getRegion()
The Cloud Dataproc region. This essentially determines which clusters are available for this job to be submitted to. If not specified, defaults to global.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#region DataprocJob#region}
-
getScheduling
@Stability(Stable) @Nullable default DataprocJobScheduling getScheduling()
scheduling block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#scheduling DataprocJob#scheduling}
-
getSparkConfig
@Stability(Stable) @Nullable default DataprocJobSparkConfig getSparkConfig()
spark_config block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#spark_config DataprocJob#spark_config}
-
getSparksqlConfig
@Stability(Stable) @Nullable default DataprocJobSparksqlConfig getSparksqlConfig()
sparksql_config block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#sparksql_config DataprocJob#sparksql_config}
-
getTimeouts
@Stability(Stable) @Nullable default DataprocJobTimeouts getTimeouts()
timeouts block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_job#timeouts DataprocJob#timeouts}
-
builder
@Stability(Stable) static DataprocJobConfig.Builder builder()
Returns:
    a DataprocJobConfig.Builder of DataprocJobConfig
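Taken together, constructing this config follows the standard jsii builder pattern: one required placement block, exactly one job-type *_config block, and optional fields as needed. A hedged sketch, not verified against a live project; the nested builders, the cluster name, and the PySpark file URI are illustrative assumptions based on the provider's dataproc_job schema:

```java
// Sketch only: requires the cdktf and provider-google Java bindings on the
// classpath. Setter names mirror the getters documented above.
DataprocJobConfig config = DataprocJobConfig.builder()
        .placement(DataprocJobPlacement.builder()
                .clusterName("my-cluster")            // hypothetical cluster name
                .build())
        .region("us-central1")                        // defaults to global if omitted
        .pysparkConfig(DataprocJobPysparkConfig.builder()
                .mainPythonFileUri("gs://my-bucket/job.py")  // hypothetical URI
                .build())
        .labels(Map.of("env", "dev"))                 // non-authoritative; see effective_labels
        .build();
```

The config object is then passed to the DataprocJob construct's constructor in the usual CDKTF fashion.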