Class DataprocWorkflowTemplateJobsSparkJob.Builder
- java.lang.Object
  - com.hashicorp.cdktf.providers.google.dataproc_workflow_template.DataprocWorkflowTemplateJobsSparkJob.Builder

All Implemented Interfaces:
- software.amazon.jsii.Builder<DataprocWorkflowTemplateJobsSparkJob>

Enclosing interface:
- DataprocWorkflowTemplateJobsSparkJob
@Stability(Stable) public static final class DataprocWorkflowTemplateJobsSparkJob.Builder extends Object implements software.amazon.jsii.Builder<DataprocWorkflowTemplateJobsSparkJob>

A builder for DataprocWorkflowTemplateJobsSparkJob.
Constructor Summary

- Builder()
Method Summary

- archiveUris(List<String> archiveUris)
- args(List<String> args)
- fileUris(List<String> fileUris)
- jarFileUris(List<String> jarFileUris)
- loggingConfig(DataprocWorkflowTemplateJobsSparkJobLoggingConfig loggingConfig)
- mainClass(String mainClass)
- mainJarFileUri(String mainJarFileUri)
- properties(Map<String,String> properties)
- build()

Method Detail
archiveUris

@Stability(Stable) public DataprocWorkflowTemplateJobsSparkJob.Builder archiveUris(List<String> archiveUris)

Sets the value of DataprocWorkflowTemplateJobsSparkJob.getArchiveUris().

Parameters:
archiveUris - Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#archive_uris DataprocWorkflowTemplate#archive_uris}

Returns:
this
args

@Stability(Stable) public DataprocWorkflowTemplateJobsSparkJob.Builder args(List<String> args)

Sets the value of DataprocWorkflowTemplateJobsSparkJob.getArgs().

Parameters:
args - Optional. The arguments to pass to the driver. Do not include arguments, such as `--conf`, that can be set as job properties, since a collision may occur that causes an incorrect job submission. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#args DataprocWorkflowTemplate#args}

Returns:
this
fileUris

@Stability(Stable) public DataprocWorkflowTemplateJobsSparkJob.Builder fileUris(List<String> fileUris)

Sets the value of DataprocWorkflowTemplateJobsSparkJob.getFileUris().

Parameters:
fileUris - Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#file_uris DataprocWorkflowTemplate#file_uris}

Returns:
this
jarFileUris

@Stability(Stable) public DataprocWorkflowTemplateJobsSparkJob.Builder jarFileUris(List<String> jarFileUris)

Sets the value of DataprocWorkflowTemplateJobsSparkJob.getJarFileUris().

Parameters:
jarFileUris - Optional. HCFS URIs of jar files to add to the CLASSPATHs of the Spark driver and tasks. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#jar_file_uris DataprocWorkflowTemplate#jar_file_uris}

Returns:
this
loggingConfig

@Stability(Stable) public DataprocWorkflowTemplateJobsSparkJob.Builder loggingConfig(DataprocWorkflowTemplateJobsSparkJobLoggingConfig loggingConfig)

Sets the value of DataprocWorkflowTemplateJobsSparkJob.getLoggingConfig().

Parameters:
loggingConfig - logging_config block. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#logging_config DataprocWorkflowTemplate#logging_config}

Returns:
this
mainClass

@Stability(Stable) public DataprocWorkflowTemplateJobsSparkJob.Builder mainClass(String mainClass)

Sets the value of DataprocWorkflowTemplateJobsSparkJob.getMainClass().

Parameters:
mainClass - The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in `jar_file_uris`. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#main_class DataprocWorkflowTemplate#main_class}

Returns:
this
mainJarFileUri

@Stability(Stable) public DataprocWorkflowTemplateJobsSparkJob.Builder mainJarFileUri(String mainJarFileUri)

Sets the value of DataprocWorkflowTemplateJobsSparkJob.getMainJarFileUri().

Parameters:
mainJarFileUri - The HCFS URI of the jar file that contains the main class. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#main_jar_file_uri DataprocWorkflowTemplate#main_jar_file_uri}

Returns:
this
properties

@Stability(Stable) public DataprocWorkflowTemplateJobsSparkJob.Builder properties(Map<String,String> properties)

Sets the value of DataprocWorkflowTemplateJobsSparkJob.getProperties().

Parameters:
properties - Optional. A mapping of property names to values, used to configure Spark. Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code. Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#properties DataprocWorkflowTemplate#properties}

Returns:
this
build

@Stability(Stable) public DataprocWorkflowTemplateJobsSparkJob build()

Builds the configured instance.

Specified by:
build in interface software.amazon.jsii.Builder<DataprocWorkflowTemplateJobsSparkJob>

Returns:
a new instance of DataprocWorkflowTemplateJobsSparkJob

Throws:
NullPointerException - if any required attribute was not provided
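Since every setter returns this, a job can be configured fluently and finalized with build(). A minimal usage sketch follows; the main class, bucket URIs, driver arguments, and property value are hypothetical placeholders, not values from this page:

```java
import java.util.Arrays;
import java.util.Map;

import com.hashicorp.cdktf.providers.google.dataproc_workflow_template.DataprocWorkflowTemplateJobsSparkJob;

public class SparkJobExample {
    public static void main(String[] args) {
        // Chain the setters documented above; each returns the Builder.
        DataprocWorkflowTemplateJobsSparkJob sparkJob =
            new DataprocWorkflowTemplateJobsSparkJob.Builder()
                .mainClass("com.example.SparkApp")                    // hypothetical driver class
                .jarFileUris(Arrays.asList("gs://my-bucket/app.jar")) // placeholder jar URI
                .args(Arrays.asList("--input", "gs://my-bucket/in/")) // placeholder driver args
                .properties(Map.of("spark.executor.memory", "4g"))    // illustrative Spark property
                .build(); // throws NullPointerException if a required attribute is missing
    }
}
```

Note that mainClass is paired with jarFileUris here; mainJarFileUri is the alternative when the main class lives in a single jar whose URI you pass directly.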