@Experimental(value=SOURCE_SINK)
public class BigQueryStorageStreamSource<T>
extends org.apache.beam.sdk.io.BoundedSource<T>
A Source representing a single stream in a read session.

| Modifier and Type | Class and Description |
|---|---|
| `static class` | `BigQueryStorageStreamSource.BigQueryStorageStreamReader<T>` — a `Source.Reader` which reads records from a stream. |
| Modifier and Type | Method and Description |
|---|---|
| `static <T> BigQueryStorageStreamSource<T>` | `create(Storage.ReadSession readSession, Storage.Stream stream, TableSchema tableSchema, SerializableFunction<SchemaAndRecord,T> parseFn, Coder<T> outputCoder, BigQueryServices bqServices)` |
| `BigQueryStorageStreamSource.BigQueryStorageStreamReader<T>` | `createReader(PipelineOptions options)` |
| `BigQueryStorageStreamSource<T>` | `fromExisting(Storage.Stream newStream)` — creates a new source with the same properties as this one, except with a different `Storage.Stream`. |
| `long` | `getEstimatedSizeBytes(PipelineOptions options)` |
| `Coder<T>` | `getOutputCoder()` |
| `void` | `populateDisplayData(DisplayData.Builder builder)` |
| `List<? extends BoundedSource<T>>` | `split(long desiredBundleSizeBytes, PipelineOptions options)` |
| `String` | `toString()` |
**create**

public static <T> BigQueryStorageStreamSource<T> create(com.google.cloud.bigquery.storage.v1beta1.Storage.ReadSession readSession,
        com.google.cloud.bigquery.storage.v1beta1.Storage.Stream stream,
        com.google.api.services.bigquery.model.TableSchema tableSchema,
        org.apache.beam.sdk.transforms.SerializableFunction<SchemaAndRecord,T> parseFn,
        org.apache.beam.sdk.coders.Coder<T> outputCoder,
        BigQueryServices bqServices)

**fromExisting**

public BigQueryStorageStreamSource<T> fromExisting(com.google.cloud.bigquery.storage.v1beta1.Storage.Stream newStream)

Creates a new source with the same properties as this one, except with a different `Storage.Stream`.

**getOutputCoder**

public org.apache.beam.sdk.coders.Coder<T> getOutputCoder()

Overrides: `getOutputCoder` in class `org.apache.beam.sdk.io.Source<T>`

**populateDisplayData**

public void populateDisplayData(org.apache.beam.sdk.transforms.display.DisplayData.Builder builder)

Specified by: `populateDisplayData` in interface `org.apache.beam.sdk.transforms.display.HasDisplayData`
Overrides: `populateDisplayData` in class `org.apache.beam.sdk.io.Source<T>`

**getEstimatedSizeBytes**

public long getEstimatedSizeBytes(org.apache.beam.sdk.options.PipelineOptions options)

Specified by: `getEstimatedSizeBytes` in class `org.apache.beam.sdk.io.BoundedSource<T>`

**split**

public java.util.List<? extends org.apache.beam.sdk.io.BoundedSource<T>> split(long desiredBundleSizeBytes,
        org.apache.beam.sdk.options.PipelineOptions options)

Specified by: `split` in class `org.apache.beam.sdk.io.BoundedSource<T>`

**createReader**

public BigQueryStorageStreamSource.BigQueryStorageStreamReader<T> createReader(org.apache.beam.sdk.options.PipelineOptions options)
        throws java.io.IOException

Specified by: `createReader` in class `org.apache.beam.sdk.io.BoundedSource<T>`

Throws: `java.io.IOException`

**toString**

public java.lang.String toString()

Overrides: `toString` in class `java.lang.Object`
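As an illustrative sketch (not part of the Javadoc above), the pieces fit together roughly as follows. It assumes the Beam BigQuery I/O module and the v1beta1 BigQuery Storage client are on the classpath; the `readSession` and `tableSchema` values (elided with `...`) would come from a prior `CreateReadSession` call, and `BigQueryServicesImpl` is shown only as one plausible `BigQueryServices` implementation:

```java
// Sketch only — session/stream acquisition is elided and must come from
// the BigQuery Storage API (CreateReadSession) in real code.
Storage.ReadSession readSession = ...;  // from a prior CreateReadSession RPC
Storage.Stream stream = readSession.getStreams(0);
TableSchema tableSchema = ...;          // schema of the table being read

// Parse each record into the output type T (String here) via parseFn.
BigQueryStorageStreamSource<String> source =
    BigQueryStorageStreamSource.create(
        readSession,
        stream,
        tableSchema,
        (SchemaAndRecord sar) -> sar.getRecord().get("name").toString(),
        StringUtf8Coder.of(),
        new BigQueryServicesImpl()); // assumed BigQueryServices implementation

// Read every record from the stream via the standard BoundedSource protocol.
PipelineOptions options = PipelineOptionsFactory.create();
try (BoundedSource.BoundedReader<String> reader = source.createReader(options)) {
  for (boolean more = reader.start(); more; more = reader.advance()) {
    System.out.println(reader.getCurrent());
  }
}
```

In a pipeline this source is normally constructed internally by `BigQueryIO.read()` with `Method.DIRECT_READ`, one instance per stream returned by `split`, rather than built by hand as above.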