public interface BigQuerySourceOrBuilder
extends com.google.protobuf.MessageOrBuilder
| Modifier and Type | Method and Description |
|---|---|
| `String` | `getDataSchema()` The schema to use when parsing the data from the source. |
| `com.google.protobuf.ByteString` | `getDataSchemaBytes()` The schema to use when parsing the data from the source. |
| `String` | `getDatasetId()` Required. |
| `com.google.protobuf.ByteString` | `getDatasetIdBytes()` Required. |
| `String` | `getGcsStagingDir()` Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters. |
| `com.google.protobuf.ByteString` | `getGcsStagingDirBytes()` Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters. |
| `BigQuerySource.PartitionCase` | `getPartitionCase()` |
| `com.google.type.Date` | `getPartitionDate()` BigQuery time partitioned table's `_PARTITIONDATE` in YYYY-MM-DD format. |
| `com.google.type.DateOrBuilder` | `getPartitionDateOrBuilder()` BigQuery time partitioned table's `_PARTITIONDATE` in YYYY-MM-DD format. |
| `String` | `getProjectId()` The project ID or the project number that contains the BigQuery source. |
| `com.google.protobuf.ByteString` | `getProjectIdBytes()` The project ID or the project number that contains the BigQuery source. |
| `String` | `getTableId()` Required. |
| `com.google.protobuf.ByteString` | `getTableIdBytes()` Required. |
| `boolean` | `hasPartitionDate()` BigQuery time partitioned table's `_PARTITIONDATE` in YYYY-MM-DD format. |
Methods inherited from interface `com.google.protobuf.MessageOrBuilder`: `findInitializationErrors`, `getAllFields`, `getDefaultInstanceForType`, `getDescriptorForType`, `getField`, `getInitializationErrorString`, `getOneofFieldDescriptor`, `getRepeatedField`, `getRepeatedFieldCount`, `getUnknownFields`, `hasField`, `hasOneof`

`boolean hasPartitionDate()`

BigQuery time partitioned table's `_PARTITIONDATE` in YYYY-MM-DD format.

`.google.type.Date partition_date = 5;`

`com.google.type.Date getPartitionDate()`

BigQuery time partitioned table's `_PARTITIONDATE` in YYYY-MM-DD format.

`.google.type.Date partition_date = 5;`

`com.google.type.DateOrBuilder getPartitionDateOrBuilder()`

BigQuery time partitioned table's `_PARTITIONDATE` in YYYY-MM-DD format.

`.google.type.Date partition_date = 5;`

`String getProjectId()`

The project ID or the project number that contains the BigQuery source. Has a length limit of 128 characters. If not specified, inherits the project ID from the parent request.

`string project_id = 1;`

`com.google.protobuf.ByteString getProjectIdBytes()`

The project ID or the project number that contains the BigQuery source. Has a length limit of 128 characters. If not specified, inherits the project ID from the parent request.

`string project_id = 1;`

`String getDatasetId()`

Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.

`string dataset_id = 2 [(.google.api.field_behavior) = REQUIRED];`

`com.google.protobuf.ByteString getDatasetIdBytes()`

Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.

`string dataset_id = 2 [(.google.api.field_behavior) = REQUIRED];`

`String getTableId()`

Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.

`string table_id = 3 [(.google.api.field_behavior) = REQUIRED];`

`com.google.protobuf.ByteString getTableIdBytes()`

Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.

`string table_id = 3 [(.google.api.field_behavior) = REQUIRED];`

`String getGcsStagingDir()`

Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters. Can be specified if one wants to have the BigQuery export to a specific Cloud Storage directory.

`string gcs_staging_dir = 4;`

`com.google.protobuf.ByteString getGcsStagingDirBytes()`

Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters. Can be specified if one wants to have the BigQuery export to a specific Cloud Storage directory.

`string gcs_staging_dir = 4;`

`String getDataSchema()`

The schema to use when parsing the data from the source.

Supported values for user event imports:

* `user_event` (default): One [UserEvent][google.cloud.discoveryengine.v1beta.UserEvent] per row.

Supported values for document imports:

* `document` (default): One [Document][google.cloud.discoveryengine.v1beta.Document] format per row. Each document must have a valid [Document.id][google.cloud.discoveryengine.v1beta.Document.id] and one of [Document.json_data][google.cloud.discoveryengine.v1beta.Document.json_data] or [Document.struct_data][google.cloud.discoveryengine.v1beta.Document.struct_data].
* `custom`: One custom data per row in arbitrary format that conforms to the defined [Schema][google.cloud.discoveryengine.v1beta.Schema] of the data store. This can only be used by the GENERIC Data Store vertical.

`string data_schema = 6;`

`com.google.protobuf.ByteString getDataSchemaBytes()`

The schema to use when parsing the data from the source.

Supported values for user event imports:

* `user_event` (default): One [UserEvent][google.cloud.discoveryengine.v1beta.UserEvent] per row.

Supported values for document imports:

* `document` (default): One [Document][google.cloud.discoveryengine.v1beta.Document] format per row. Each document must have a valid [Document.id][google.cloud.discoveryengine.v1beta.Document.id] and one of [Document.json_data][google.cloud.discoveryengine.v1beta.Document.json_data] or [Document.struct_data][google.cloud.discoveryengine.v1beta.Document.struct_data].
* `custom`: One custom data per row in arbitrary format that conforms to the defined [Schema][google.cloud.discoveryengine.v1beta.Schema] of the data store. This can only be used by the GENERIC Data Store vertical.

`string data_schema = 6;`

`BigQuerySource.PartitionCase getPartitionCase()`
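As a usage sketch (assuming the `google-cloud-discoveryengine` client library and protobuf runtime are on the classpath; the project, dataset, and bucket names below are hypothetical placeholders), a `BigQuerySource` message can be built with the generated builder and read back through this `BigQuerySourceOrBuilder` interface:

```java
import com.google.cloud.discoveryengine.v1beta.BigQuerySource;
import com.google.cloud.discoveryengine.v1beta.BigQuerySourceOrBuilder;
import com.google.type.Date;

public class BigQuerySourceExample {
    public static void main(String[] args) {
        // Build a BigQuerySource. dataset_id and table_id are REQUIRED fields;
        // project_id is optional and inherits from the parent request if unset.
        BigQuerySource source =
            BigQuerySource.newBuilder()
                .setProjectId("my-project")                 // hypothetical project ID
                .setDatasetId("my_dataset")                 // required
                .setTableId("my_table")                     // required
                .setGcsStagingDir("gs://my-bucket/staging") // optional staging dir
                .setDataSchema("document")
                // partition_date is one member of the `partition` oneof.
                .setPartitionDate(
                    Date.newBuilder().setYear(2024).setMonth(1).setDay(15).build())
                .build();

        // Any built message can be read through the *OrBuilder interface.
        BigQuerySourceOrBuilder ob = source;
        System.out.println(ob.getDatasetId());
        System.out.println(ob.hasPartitionDate());
        // getPartitionCase() reports which oneof member is set
        // (PARTITION_DATE here, PARTITION_NOT_SET if none).
        System.out.println(ob.getPartitionCase());
    }
}
```

Because the proto `partition` fields form a oneof, `hasPartitionDate()` and `getPartitionCase()` are the safe way to check whether a partition date was provided before calling `getPartitionDate()`.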
Copyright © 2025 Google LLC. All rights reserved.