public class BigQueryDirectDataWriterContextFactory extends Object implements DataWriterContextFactory<org.apache.spark.sql.catalyst.InternalRow>
| Constructor and Description |
|---|
| `BigQueryDirectDataWriterContextFactory(BigQueryClientFactory writeClientFactory, String tablePath, org.apache.spark.sql.types.StructType sparkSchema, com.google.cloud.bigquery.storage.v1.ProtoSchema protoSchema, boolean ignoreInputs, com.google.api.gax.retrying.RetrySettings bigqueryDataWriterHelperRetrySettings, com.google.common.base.Optional<String> traceId, boolean writeAtLeastOnce)` |
| Modifier and Type | Method and Description |
|---|---|
| `DataWriterContext<org.apache.spark.sql.catalyst.InternalRow>` | `createDataWriterContext(int partitionId, long taskId, long epochId)` If `ignoreInputs` is true, returns a `NoOpDataWriterContext`, a stub class whose methods perform no operations when called; otherwise returns a `BigQueryDirectDataWriterContext`. |
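The `ignoreInputs` dispatch described above can be sketched as a plain factory. The classes below (`StubWriterContext`, `DirectWriterContext`, `WriterContextFactorySketch`) are hypothetical stand-ins for the connector's `NoOpDataWriterContext`, `BigQueryDirectDataWriterContext`, and this factory; they illustrate only the branching logic, not the real BigQuery Storage write path:

```java
// Minimal, self-contained sketch of the factory's ignoreInputs dispatch.
// All class names here are illustrative stand-ins, not the connector's real types.
interface WriterContext {
    String describe();
}

// Stand-in for NoOpDataWriterContext: a stub whose methods do nothing.
class StubWriterContext implements WriterContext {
    public String describe() { return "no-op"; }
}

// Stand-in for BigQueryDirectDataWriterContext: in the real connector this
// would stream rows for one partition/task to BigQuery.
class DirectWriterContext implements WriterContext {
    private final int partitionId;
    private final long taskId;
    DirectWriterContext(int partitionId, long taskId) {
        this.partitionId = partitionId;
        this.taskId = taskId;
    }
    public String describe() { return "direct:" + partitionId + ":" + taskId; }
}

class WriterContextFactorySketch {
    private final boolean ignoreInputs;
    WriterContextFactorySketch(boolean ignoreInputs) { this.ignoreInputs = ignoreInputs; }

    // Mirrors createDataWriterContext(partitionId, taskId, epochId):
    // return a stub when inputs are ignored, otherwise a direct writer context.
    WriterContext createDataWriterContext(int partitionId, long taskId, long epochId) {
        if (ignoreInputs) {
            return new StubWriterContext();
        }
        return new DirectWriterContext(partitionId, taskId);
    }

    public static void main(String[] args) {
        WriterContext noOp = new WriterContextFactorySketch(true).createDataWriterContext(0, 1L, 0L);
        WriterContext direct = new WriterContextFactorySketch(false).createDataWriterContext(0, 1L, 0L);
        System.out.println(noOp.describe());   // no-op
        System.out.println(direct.describe()); // direct:0:1
    }
}
```

Keeping the decision in the factory means Spark's write path can request one context per (partition, task, epoch) without itself knowing whether writes are being skipped.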
public BigQueryDirectDataWriterContextFactory(BigQueryClientFactory writeClientFactory, String tablePath, org.apache.spark.sql.types.StructType sparkSchema, com.google.cloud.bigquery.storage.v1.ProtoSchema protoSchema, boolean ignoreInputs, com.google.api.gax.retrying.RetrySettings bigqueryDataWriterHelperRetrySettings, com.google.common.base.Optional<String> traceId, boolean writeAtLeastOnce)
public DataWriterContext<org.apache.spark.sql.catalyst.InternalRow> createDataWriterContext(int partitionId, long taskId, long epochId)
Specified by: `createDataWriterContext` in interface `DataWriterContextFactory<org.apache.spark.sql.catalyst.InternalRow>`
Parameters:
`partitionId` - the partitionId of the DataWriter to be created
`taskId` - the taskId
`epochId` - the epochId
See Also: `NoOpDataWriterContext`, `BigQueryDirectDataWriterContext`

Copyright © 2024. All rights reserved.