Class DataprocWorkflowTemplateJobsHadoopJob.Builder
java.lang.Object
  com.hashicorp.cdktf.providers.google.dataproc_workflow_template.DataprocWorkflowTemplateJobsHadoopJob.Builder

All Implemented Interfaces:
  software.amazon.jsii.Builder<DataprocWorkflowTemplateJobsHadoopJob>

Enclosing interface:
  DataprocWorkflowTemplateJobsHadoopJob
@Stability(Stable)
public static final class DataprocWorkflowTemplateJobsHadoopJob.Builder
extends Object
implements software.amazon.jsii.Builder<DataprocWorkflowTemplateJobsHadoopJob>

A builder for DataprocWorkflowTemplateJobsHadoopJob.
Constructor Summary

Constructors:
  Builder()
Method Summary

  archiveUris(List<String> archiveUris)
  args(List<String> args)
  fileUris(List<String> fileUris)
  jarFileUris(List<String> jarFileUris)
  loggingConfig(DataprocWorkflowTemplateJobsHadoopJobLoggingConfig loggingConfig)
  mainClass(String mainClass)
  mainJarFileUri(String mainJarFileUri)
  properties(Map<String,String> properties)
  build()
Method Detail
archiveUris

@Stability(Stable)
public DataprocWorkflowTemplateJobsHadoopJob.Builder archiveUris(List<String> archiveUris)

Sets the value of DataprocWorkflowTemplateJobsHadoopJob.getArchiveUris().

Parameters:
  archiveUris - Optional. HCFS URIs of archives to be extracted in the working directory of Hadoop drivers and tasks. Supported file types: .jar, .tar, .tar.gz, .tgz, or .zip. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#archive_uris

Returns:
  this
args

@Stability(Stable)
public DataprocWorkflowTemplateJobsHadoopJob.Builder args(List<String> args)

Sets the value of DataprocWorkflowTemplateJobsHadoopJob.getArgs().

Parameters:
  args - Optional. The arguments to pass to the driver. Do not include arguments, such as `-libjars` or `-Dfoo=bar`, that can be set as job properties, since a collision may occur that causes an incorrect job submission. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#args

Returns:
  this
fileUris

@Stability(Stable)
public DataprocWorkflowTemplateJobsHadoopJob.Builder fileUris(List<String> fileUris)

Sets the value of DataprocWorkflowTemplateJobsHadoopJob.getFileUris().

Parameters:
  fileUris - Optional. HCFS (Hadoop Compatible Filesystem) URIs of files to be copied to the working directory of Hadoop drivers and distributed tasks. Useful for naively parallel tasks. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#file_uris

Returns:
  this
jarFileUris

@Stability(Stable)
public DataprocWorkflowTemplateJobsHadoopJob.Builder jarFileUris(List<String> jarFileUris)

Sets the value of DataprocWorkflowTemplateJobsHadoopJob.getJarFileUris().

Parameters:
  jarFileUris - Optional. Jar file URIs to add to the CLASSPATHs of the Hadoop driver and tasks. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#jar_file_uris

Returns:
  this
loggingConfig

@Stability(Stable)
public DataprocWorkflowTemplateJobsHadoopJob.Builder loggingConfig(DataprocWorkflowTemplateJobsHadoopJobLoggingConfig loggingConfig)

Sets the value of DataprocWorkflowTemplateJobsHadoopJob.getLoggingConfig().

Parameters:
  loggingConfig - logging_config block. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#logging_config

Returns:
  this
mainClass

@Stability(Stable)
public DataprocWorkflowTemplateJobsHadoopJob.Builder mainClass(String mainClass)

Sets the value of DataprocWorkflowTemplateJobsHadoopJob.getMainClass().

Parameters:
  mainClass - The name of the driver's main class. The jar file containing the class must be in the default CLASSPATH or specified in `jar_file_uris`. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#main_class

Returns:
  this
mainJarFileUri

@Stability(Stable)
public DataprocWorkflowTemplateJobsHadoopJob.Builder mainJarFileUri(String mainJarFileUri)

Sets the value of DataprocWorkflowTemplateJobsHadoopJob.getMainJarFileUri().

Parameters:
  mainJarFileUri - The HCFS URI of the jar file containing the main class. Examples: 'gs://foo-bucket/analytics-binaries/extract-useful-metrics-mr.jar', 'hdfs:/tmp/test-samples/custom-wordcount.jar', 'file:///home/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar'. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#main_jar_file_uri

Returns:
  this
properties

@Stability(Stable)
public DataprocWorkflowTemplateJobsHadoopJob.Builder properties(Map<String,String> properties)

Sets the value of DataprocWorkflowTemplateJobsHadoopJob.getProperties().

Parameters:
  properties - Optional. A mapping of property names to values, used to configure Hadoop. Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/hadoop/conf/*-site and classes in user code. Docs at Terraform Registry: https://registry.terraform.io/providers/hashicorp/google/5.43.1/docs/resources/dataproc_workflow_template#properties

Returns:
  this
build

@Stability(Stable)
public DataprocWorkflowTemplateJobsHadoopJob build()

Builds the configured instance.

Specified by:
  build in interface software.amazon.jsii.Builder<DataprocWorkflowTemplateJobsHadoopJob>

Returns:
  a new instance of DataprocWorkflowTemplateJobsHadoopJob

Throws:
  NullPointerException - if any required attribute was not provided
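As a usage sketch, the setters above chain fluently and `build()` produces the immutable struct. This assumes the standard jsii-generated static `builder()` factory on the enclosing interface; the bucket names, jar paths, and property values below are illustrative placeholders, not values from this documentation:

```java
import com.hashicorp.cdktf.providers.google.dataproc_workflow_template.DataprocWorkflowTemplateJobsHadoopJob;

import java.util.Arrays;
import java.util.Map;

// Configure a Hadoop job for a workflow template step.
// Per the docs above, either mainClass or mainJarFileUri identifies the driver;
// here we use mainJarFileUri. All URIs below are placeholder examples.
DataprocWorkflowTemplateJobsHadoopJob hadoopJob =
    DataprocWorkflowTemplateJobsHadoopJob.builder()
        .mainJarFileUri("file:///usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar")
        // Plain driver arguments; flags like -libjars belong in properties instead.
        .args(Arrays.asList("wordcount", "gs://my-bucket/input/", "gs://my-bucket/output/"))
        // Extra jars added to the driver and task CLASSPATHs (placeholder URI).
        .jarFileUris(Arrays.asList("gs://my-bucket/libs/helper.jar"))
        // Hadoop configuration properties (placeholder key/value).
        .properties(Map.of("mapreduce.job.reduces", "2"))
        .build();
```

Omitted optional setters are simply left unset; `build()` throws NullPointerException only if a required attribute is missing.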