public interface BigQuerySourceOrBuilder
extends com.google.protobuf.MessageOrBuilder
| Modifier and Type | Method and Description |
|---|---|
| `String` | `getDataSchema()` The schema to use when parsing the data from the source. |
| `com.google.protobuf.ByteString` | `getDataSchemaBytes()` The schema to use when parsing the data from the source. |
| `String` | `getDatasetId()` Required. |
| `com.google.protobuf.ByteString` | `getDatasetIdBytes()` Required. |
| `String` | `getGcsStagingDir()` Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters. |
| `com.google.protobuf.ByteString` | `getGcsStagingDirBytes()` Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters. |
| `BigQuerySource.PartitionCase` | `getPartitionCase()` |
| `com.google.type.Date` | `getPartitionDate()` BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. |
| `com.google.type.DateOrBuilder` | `getPartitionDateOrBuilder()` BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. |
| `String` | `getProjectId()` The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters. |
| `com.google.protobuf.ByteString` | `getProjectIdBytes()` The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters. |
| `String` | `getTableId()` Required. |
| `com.google.protobuf.ByteString` | `getTableIdBytes()` Required. |
| `boolean` | `hasPartitionDate()` BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. |
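The string fields above carry documented length limits (`project_id` at most 128 characters, `dataset_id` and `table_id` at most 1,024, `gcs_staging_dir` at most 2,000). These can be checked client-side before building a `BigQuerySource`; the sketch below uses only the standard library, and the class and method names are illustrative, not part of the generated API:

```java
import java.util.Map;

public class BigQuerySourceLimits {
    // Documented length limits for BigQuerySource string fields.
    static final Map<String, Integer> LIMITS = Map.of(
            "project_id", 128,
            "dataset_id", 1024,
            "table_id", 1024,
            "gcs_staging_dir", 2000);

    // Returns true if the value fits the documented limit for the field.
    static boolean fitsLimit(String field, String value) {
        Integer limit = LIMITS.get(field);
        if (limit == null) {
            throw new IllegalArgumentException("unknown field: " + field);
        }
        return value.length() <= limit;
    }

    public static void main(String[] args) {
        System.out.println(fitsLimit("project_id", "my-project"));      // true
        System.out.println(fitsLimit("gcs_staging_dir", "x".repeat(2001))); // false
    }
}
```

Validating lengths before the request avoids a round trip that would fail server-side on an over-long field.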
Methods inherited from interface com.google.protobuf.MessageOrBuilder: findInitializationErrors, getAllFields, getDefaultInstanceForType, getDescriptorForType, getField, getInitializationErrorString, getOneofFieldDescriptor, getRepeatedField, getRepeatedFieldCount, getUnknownFields, hasField, hasOneof

boolean hasPartitionDate()

BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. Only supported in [ImportProductsRequest][google.cloud.retail.v2.ImportProductsRequest].
.google.type.Date partition_date = 6;

com.google.type.Date getPartitionDate()

BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. Only supported in [ImportProductsRequest][google.cloud.retail.v2.ImportProductsRequest].

.google.type.Date partition_date = 6;

com.google.type.DateOrBuilder getPartitionDateOrBuilder()

BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. Only supported in [ImportProductsRequest][google.cloud.retail.v2.ImportProductsRequest].

.google.type.Date partition_date = 6;

String getProjectId()

The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters. If not specified, inherits the project ID from the parent request.

string project_id = 5;

com.google.protobuf.ByteString getProjectIdBytes()

The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters. If not specified, inherits the project ID from the parent request.

string project_id = 5;

String getDatasetId()

Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.

string dataset_id = 1 [(.google.api.field_behavior) = REQUIRED];

com.google.protobuf.ByteString getDatasetIdBytes()

Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.

string dataset_id = 1 [(.google.api.field_behavior) = REQUIRED];

String getTableId()

Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.

string table_id = 2 [(.google.api.field_behavior) = REQUIRED];

com.google.protobuf.ByteString getTableIdBytes()

Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.

string table_id = 2 [(.google.api.field_behavior) = REQUIRED];

String getGcsStagingDir()

Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters. Can be specified if one wants to have the BigQuery export to a specific Cloud Storage directory.

string gcs_staging_dir = 3;

com.google.protobuf.ByteString getGcsStagingDirBytes()

Intermediate Cloud Storage directory used for the import with a length limit of 2,000 characters. Can be specified if one wants to have the BigQuery export to a specific Cloud Storage directory.

string gcs_staging_dir = 3;

String getDataSchema()

The schema to use when parsing the data from the source.

Supported values for product imports:
* `product` (default): One JSON [Product][google.cloud.retail.v2.Product] per line. Each product must have a valid [Product.id][google.cloud.retail.v2.Product.id].
* `product_merchant_center`: See [Importing catalog data from Merchant Center](https://cloud.google.com/retail/recommendations-ai/docs/upload-catalog#mc).

Supported values for user events imports:
* `user_event` (default): One JSON [UserEvent][google.cloud.retail.v2.UserEvent] per line.
* `user_event_ga360`: The schema is available here: https://support.google.com/analytics/answer/3437719.
* `user_event_ga4`: The schema is available here: https://support.google.com/analytics/answer/7029846.

Supported values for autocomplete imports:
* `suggestions` (default): One JSON completion suggestion per line.
* `denylist`: One JSON deny suggestion per line.
* `allowlist`: One JSON allow suggestion per line.

string data_schema = 4;

com.google.protobuf.ByteString getDataSchemaBytes()

The schema to use when parsing the data from the source.

Supported values for product imports:
* `product` (default): One JSON [Product][google.cloud.retail.v2.Product] per line. Each product must have a valid [Product.id][google.cloud.retail.v2.Product.id].
* `product_merchant_center`: See [Importing catalog data from Merchant Center](https://cloud.google.com/retail/recommendations-ai/docs/upload-catalog#mc).

Supported values for user events imports:
* `user_event` (default): One JSON [UserEvent][google.cloud.retail.v2.UserEvent] per line.
* `user_event_ga360`: The schema is available here: https://support.google.com/analytics/answer/3437719.
* `user_event_ga4`: The schema is available here: https://support.google.com/analytics/answer/7029846.

Supported values for autocomplete imports:
* `suggestions` (default): One JSON completion suggestion per line.
* `denylist`: One JSON deny suggestion per line.
* `allowlist`: One JSON allow suggestion per line.

string data_schema = 4;

BigQuerySource.PartitionCase getPartitionCase()
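The `data_schema` values documented above form a small fixed set per import type, and `_PARTITIONDATE` values use the YYYY-MM-DD form, which matches ISO-8601 local dates. A standard-library sketch of both checks; the class and helper names are illustrative and not part of the generated API:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.Set;

public class DataSchemaCheck {
    // Documented data_schema values, grouped by import type.
    static final Set<String> PRODUCT_SCHEMAS =
            Set.of("product", "product_merchant_center");
    static final Set<String> USER_EVENT_SCHEMAS =
            Set.of("user_event", "user_event_ga360", "user_event_ga4");
    static final Set<String> AUTOCOMPLETE_SCHEMAS =
            Set.of("suggestions", "denylist", "allowlist");

    // True if the schema appears in any of the documented value lists.
    static boolean isKnownSchema(String schema) {
        return PRODUCT_SCHEMAS.contains(schema)
                || USER_EVENT_SCHEMAS.contains(schema)
                || AUTOCOMPLETE_SCHEMAS.contains(schema);
    }

    // Formats a date in the YYYY-MM-DD form used by _PARTITIONDATE.
    static String toPartitionDate(LocalDate date) {
        return date.format(DateTimeFormatter.ISO_LOCAL_DATE);
    }

    public static void main(String[] args) {
        System.out.println(isKnownSchema("user_event_ga4"));              // true
        System.out.println(toPartitionDate(LocalDate.of(2023, 1, 5)));    // 2023-01-05
    }
}
```

Note that an unset `data_schema` falls back to the per-import-type default (`product`, `user_event`, or `suggestions`), so an empty string is also acceptable in practice.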
Copyright © 2023 Google LLC. All rights reserved.