A
- apply(TopicPartition) - Method in class org.apache.beam.sdk.io.kafka.CheckStopReadingFnWrapper
- AUTOVALUE_CLASS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- AUTOVALUE_CLASS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
B
- build() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
-
Builds a KafkaReadSchemaTransformConfiguration instance.
- build() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- builder() - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
-
Instantiates a KafkaReadSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- Builder() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
C
- CheckStopReadingFn - Interface in org.apache.beam.sdk.io.kafka
- CheckStopReadingFnWrapper - Class in org.apache.beam.sdk.io.kafka
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- close() - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
- commitOffsets() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Enable committing record offset.
- commitOffsetsInFinalize() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Finalized offsets are committed to Kafka.
- Configuration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- configurationClass() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- configure(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
- ConfluentSchemaRegistryDeserializerProvider<T> - Class in org.apache.beam.sdk.io.kafka
-
A DeserializerProvider that uses Confluent Schema Registry to resolve a Deserializer and Coder given a subject.
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- consistentWithEquals() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- create() - Static method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
- CREATE_TIME - org.apache.beam.sdk.io.kafka.KafkaTimestampType
- createRPCLatencyHistogram(KafkaSinkMetrics.RpcMethod, String) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
-
Creates a Histogram metric to record RPC latency.
- createTimestampPolicy(TopicPartition, Optional<Instant>) - Method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
Creates a TimestampPolicy for a partition.
- currentWatermark - Variable in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
- CustomTimestampPolicyWithLimitedDelay<K,V> - Class in org.apache.beam.sdk.io.kafka
-
A policy for custom record timestamps where timestamps within a partition are expected to be roughly monotonically increasing, with a cap on out-of-order event delays (say, 1 minute).
- CustomTimestampPolicyWithLimitedDelay(SerializableFunction<KafkaRecord<K, V>, Instant>, Duration, Optional<Instant>) - Constructor for class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
-
A policy for custom record timestamps where timestamps are expected to be roughly monotonically increasing, with out-of-order event delays less than maxDelay.
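The limited-delay behavior described in the entries above can be sketched in plain Java. This is a simplified illustration of the watermark arithmetic only; the class and method names below are hypothetical, not the Beam API. The idea: the watermark trails the maximum event timestamp observed so far by maxDelay, so any record arriving no more than maxDelay behind the newest one is still on time.

```java
import java.time.Duration;
import java.time.Instant;

// Hypothetical sketch (not the Beam class) of the watermark arithmetic
// behind a limited-delay timestamp policy: records may arrive out of
// order, but the watermark trails the maximum event time seen by
// maxDelay, bounding how late an out-of-order record may be.
class LimitedDelayWatermark {
    private final Duration maxDelay;
    private Instant maxEventTime = Instant.EPOCH;

    LimitedDelayWatermark(Duration maxDelay) {
        this.maxDelay = maxDelay;
    }

    // Called once per record with its event timestamp; only the
    // maximum matters for the watermark.
    void observe(Instant eventTime) {
        if (eventTime.isAfter(maxEventTime)) {
            maxEventTime = eventTime;
        }
    }

    // Watermark = newest event time observed, held back by maxDelay.
    Instant currentWatermark() {
        return maxEventTime.minus(maxDelay);
    }
}
```

The actual Beam policy combines this rule with per-partition state and backlog information from the PartitionContext; the sketch shows only the core max-minus-delay rule.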
D
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- deserialize(String, byte[]) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- DeserializerProvider<T> - Interface in org.apache.beam.sdk.io.kafka
-
Provides a configured Deserializer instance and its associated Coder.
- DISALLOWED_CONSUMER_PROPERTIES - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIOUtils
E
- encode(KafkaRecord<K, V>, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- encode(ProducerRecord<K, V>, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- encode(TopicPartition, OutputStream) - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- equals(Object) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- ERROR_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- ErrorFn(String, SerializableFunction<byte[], Row>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
- expand(PCollection<KafkaSourceDescriptor>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- expand(PCollection<KV<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- expand(PCollection<KV<KafkaSourceDescriptor, KafkaRecord<K, V>>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaCommitOffset
- expand(PCollection<ProducerRecord<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- External() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
- External() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External
- externalWithMetadata() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
F
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- finish(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
- forOrdinal(int) - Static method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
- from(KafkaReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- from(KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
G
- getAutoOffsetResetConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getBacklogCheckTime() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
-
The time at which the latest offset for the partition was fetched in order to calculate backlog.
- getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getBadRecordErrorHandler() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getBadRecordRouter() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getBootstrapServers() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
-
Returns the bootstrap servers for the Kafka consumer.
- getBootstrapServers() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getCheckpointingInterval() - Method in interface org.apache.beam.sdk.io.kafka.KafkaIO.Read.FakeFlinkPipelineOptions
- getCheckStopReadingFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getCoder(CoderRegistry) - Method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- getCoder(CoderRegistry) - Method in interface org.apache.beam.sdk.io.kafka.DeserializerProvider
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- getCoderArguments() - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- getConfluentSchemaRegistrySubject() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getConfluentSchemaRegistryUrl() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getConsumerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getConsumerConfigUpdates() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getConsumerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getConsumerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getConsumerPollingTimeout() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getDeserializer(Map<String, ?>, boolean) - Method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- getDeserializer(Map<String, ?>, boolean) - Method in interface org.apache.beam.sdk.io.kafka.DeserializerProvider
- getErrorHandling() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getErrorHandling() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getFileDescriptorPath() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getFormat() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getHeaders() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getKeyCoder() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getKeyDeserializerProvider() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getKeySerializer() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getKV() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getMaxNumRecords() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getMaxReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getMaxReadTimeSeconds() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getMessageBacklog() - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
-
Current backlog in messages (latest offset of the partition minus the last processed record offset).
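The backlog stated in the entry above is plain offset arithmetic; a hypothetical one-method sketch (not the Beam API) makes it concrete:

```java
// Hypothetical sketch of the message-backlog formula: backlog =
// latest offset of the partition minus the offset of the last
// processed record, clamped at zero in case the latest-offset
// fetch is momentarily stale.
final class BacklogMath {
    static long messageBacklog(long latestOffset, long lastProcessedOffset) {
        return Math.max(0L, latestOffset - lastProcessedOffset);
    }
}
```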
- getMessageName() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getMessageName() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getNextOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getNumShards() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getOffset() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getOffsetConsumerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- getProducerConfig() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getProducerConfigUpdates() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getProducerFactoryFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getPublishTimestampFunction() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getRawBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- getRedistributeNumKeys() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getRowToRawBytesFunction(String) - Static method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- getSchema() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- getSchema() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getSinkGroupId() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getStartReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getStopReadTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getTimestamp(T, Instant) - Method in interface org.apache.beam.sdk.io.kafka.KafkaPublishTimestampFunction
-
Returns the timestamp for an element being published to Kafka.
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
-
Returns the record timestamp (a.k.a. event time).
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
- getTimestampForRecord(TimestampPolicy.PartitionContext, KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
- getTimestampPolicyFactory() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTimestampType() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
-
Returns the topic from which to read.
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
- getTopic() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- getTopicPartition() - Method in class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
- getTopicPartitions() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTopicPattern() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTopics() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.WriteRegistrar
- getValueCoder() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getValueDeserializerProvider() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getValueSerializer() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- getWatchTopicPartitionDuration() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.CustomTimestampPolicyWithLimitedDelay
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicy
-
Returns the watermark for the partition.
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
- getWatermark(TimestampPolicy.PartitionContext) - Method in class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.TimestampFnPolicy
- getWatermarkFn() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- getWatermarkMillis() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- getWriteRecordsTransform() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
H
- hashCode() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecord
I
- id - Variable in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
- identifier() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- InstantDeserializer - Class in org.apache.beam.sdk.io.kafka.serialization
-
Kafka Deserializer for Instant.
- InstantDeserializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantDeserializer
- InstantSerializer - Class in org.apache.beam.sdk.io.kafka.serialization
-
Kafka Serializer for Instant.
- InstantSerializer() - Constructor for class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
- isAllowDuplicates() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- isCommitOffsetsInFinalizeEnabled() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- isDynamicRead() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- isEOS() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- isRedistributed() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- isRegisterByteSizeObserverCheap(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- isRegisterByteSizeObserverCheap(ProducerRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
K
- KAFKA_READ_OVERRIDE - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
A PTransformOverride for runners to swap KafkaIO.Read.ReadFromKafkaViaSDF to the legacy Kafka read when a runner does not have good support for executing unbounded Splittable DoFn.
- KafkaCheckpointMark - Class in org.apache.beam.sdk.io.kafka
-
Checkpoint for a KafkaUnboundedReader.
- KafkaCheckpointMark(List<KafkaCheckpointMark.PartitionMark>, Optional<KafkaUnboundedReader<?, ?>>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- KafkaCheckpointMark.PartitionMark - Class in org.apache.beam.sdk.io.kafka
-
A tuple to hold topic, partition, and offset that comprise the checkpoint for a single partition.
- KafkaCommitOffset<K,V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform that commits offsets of KafkaRecord.
- KafkaIO - Class in org.apache.beam.sdk.io.kafka
-
An unbounded source and a sink for Kafka topics.
- KafkaIO.Read<K,V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform to read from Kafka topics.
- KafkaIO.Read.External - Class in org.apache.beam.sdk.io.kafka
-
Exposes KafkaIO.TypedWithoutMetadata as an external transform for cross-language usage.
- KafkaIO.Read.External.Configuration - Class in org.apache.beam.sdk.io.kafka
-
Parameters class to expose the Read transform to an external SDK.
- KafkaIO.Read.FakeFlinkPipelineOptions - Interface in org.apache.beam.sdk.io.kafka
- KafkaIO.ReadSourceDescriptors<K,V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform to read from KafkaSourceDescriptor.
- KafkaIO.TypedWithoutMetadata<K,V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform to read from Kafka topics.
- KafkaIO.Write<K,V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform to write to a Kafka topic with KVs.
- KafkaIO.Write.External - Class in org.apache.beam.sdk.io.kafka
-
Exposes KafkaIO.Write as an external transform for cross-language usage.
- KafkaIO.Write.External.Configuration - Class in org.apache.beam.sdk.io.kafka
-
Parameters class to expose the Write transform to an external SDK.
- KafkaIO.WriteRecords<K,V> - Class in org.apache.beam.sdk.io.kafka
-
A PTransform to write to a Kafka topic with ProducerRecords.
- KafkaIOUtils - Class in org.apache.beam.sdk.io.kafka
-
Common utility functions and default configurations for KafkaIO.Read and KafkaIO.ReadSourceDescriptors.
- KafkaIOUtils() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIOUtils
- KafkaMetrics - Interface in org.apache.beam.sdk.io.kafka
-
Stores and exports metrics for a batch of Kafka Client RPCs.
- KafkaMetrics.KafkaMetricsImpl - Class in org.apache.beam.sdk.io.kafka
-
Metrics of a batch of RPCs.
- KafkaMetrics.NoOpKafkaMetrics - Class in org.apache.beam.sdk.io.kafka
-
No-op implementation of KafkaMetrics.
- KafkaMetricsImpl() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
- KafkaPublishTimestampFunction<T> - Interface in org.apache.beam.sdk.io.kafka
-
An interface for providing custom timestamp for elements written to Kafka.
- KafkaReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.kafka
-
Configuration for reading from a Kafka topic.
- KafkaReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- KafkaReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.kafka
-
Builder for the KafkaReadSchemaTransformConfiguration.
- KafkaReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.kafka
- KafkaReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- KafkaReadSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io.kafka
- KafkaRecord<K,V> - Class in org.apache.beam.sdk.io.kafka
-
KafkaRecord contains key and value of the record as well as metadata for the record (topic name, partition id, and offset).
- KafkaRecord(String, int, long, long, KafkaTimestampType, Headers, K, V) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecord
- KafkaRecord(String, int, long, long, KafkaTimestampType, Headers, KV<K, V>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecord
- KafkaRecordCoder<K,V> - Class in org.apache.beam.sdk.io.kafka
-
Coder for KafkaRecord.
- KafkaRecordCoder(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- KafkaSchemaTransformTranslation - Class in org.apache.beam.sdk.io.kafka
- KafkaSchemaTransformTranslation() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation
- KafkaSchemaTransformTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.kafka
- KafkaSchemaTransformTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.kafka
- KafkaSinkMetrics - Class in org.apache.beam.sdk.io.kafka
-
Helper class to create per-worker metrics for Kafka sink stages.
- KafkaSinkMetrics() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
- KafkaSourceDescriptor - Class in org.apache.beam.sdk.io.kafka
-
Represents a Kafka source description.
- KafkaSourceDescriptor() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
- KafkaTimestampType - Enum in org.apache.beam.sdk.io.kafka
-
This is a copy of Kafka's TimestampType.
- KafkaWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration
- KafkaWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.kafka
- KafkaWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.kafka
- KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.kafka
- knownBuilders() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
- knownBuilders() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External
L
- LOG_APPEND_TIME - org.apache.beam.sdk.io.kafka.KafkaTimestampType
- LogAppendTimePolicy(Optional<Instant>) - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.LogAppendTimePolicy
M
- METRICS_NAMESPACE - Static variable in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
N
- name - Variable in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
- NO_TIMESTAMP_TYPE - org.apache.beam.sdk.io.kafka.KafkaTimestampType
O
- of(SerializableFunction<TopicPartition, Boolean>) - Static method in class org.apache.beam.sdk.io.kafka.CheckStopReadingFnWrapper
- of(String, int, String) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, int, String, Integer) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, int, String, Integer, Map<String, ?>) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, String) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, String, Integer) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(String, String, Integer, Map<String, ?>) - Static method in class org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- of(Coder<K>, Coder<V>) - Static method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- of(TopicPartition, Long, Instant, Long, Instant, List<String>) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSourceDescriptor
- org.apache.beam.sdk.io.kafka - package org.apache.beam.sdk.io.kafka
-
Transforms for reading and writing from Apache Kafka.
- org.apache.beam.sdk.io.kafka.serialization - package org.apache.beam.sdk.io.kafka.serialization
-
Kafka serializers and deserializers.
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
P
- PartitionContext() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicy.PartitionContext
- PartitionMark(String, int, long, long) - Constructor for class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.TypedWithoutMetadata
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- process(byte[], DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformProvider.ErrorFn
- ProcessingTimePolicy() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicyFactory.ProcessingTimePolicy
- ProducerRecordCoder<K,V> - Class in org.apache.beam.sdk.io.kafka
-
Coder for ProducerRecord.
- ProducerRecordCoder(Coder<K>, Coder<V>) - Constructor for class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
R
- read() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
-
Creates an uninitialized KafkaIO.Read PTransform.
- read() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- Read() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- readBytes() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
-
A specific instance of uninitialized KafkaIO.read() where keys and values are bytes.
- ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.ReadRegistrar
- readSourceDescriptors() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
-
Creates an uninitialized KafkaIO.ReadSourceDescriptors PTransform.
- ReadSourceDescriptors() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
S
- serialize(String, Instant) - Method in class org.apache.beam.sdk.io.kafka.serialization.InstantSerializer
- setAllowDuplicates(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setAutoOffsetResetConfig(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
-
Sets the bootstrap servers for the Kafka consumer.
- setBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setCheckpointingInterval(Long) - Method in interface org.apache.beam.sdk.io.kafka.KafkaIO.Read.FakeFlinkPipelineOptions
- setCommitOffsetInFinalize(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setConfluentSchemaRegistrySubject(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setConfluentSchemaRegistryUrl(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setConsumerConfig(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setConsumerConfigUpdates(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setConsumerPollingTimeout(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setFileDescriptorPath(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setFileDescriptorPath(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setFormat(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setFormat(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setKeyDeserializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setKeySerializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- setMaxNumRecords(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setMaxReadTime(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setMaxReadTimeSeconds(Integer) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setMessageName(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setMessageName(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setProducerConfig(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- setProducerConfigUpdates(Map<String, String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setRedistribute(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setRedistributeNumKeys(Integer) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setSchema(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
- setSchema(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setStartReadTime(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setStopReadTime(Long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setSupportKafkaMetrics(boolean) - Static method in class org.apache.beam.sdk.io.kafka.KafkaSinkMetrics
- setTimestampPolicy(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- setTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration.Builder
-
Sets the topic from which to read.
- setTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider.KafkaWriteSchemaTransformConfiguration.Builder
- setTopics(List<String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setup() - Method in interface org.apache.beam.sdk.io.kafka.CheckStopReadingFn
- setValueDeserializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External.Configuration
- setValueSerializer(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External.Configuration
- structuralValue(KafkaRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- structuralValue(ProducerRecord<K, V>) - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- SUPPORTED_FORMATS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
- SUPPORTED_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.kafka.KafkaWriteSchemaTransformProvider
T
- teardown() - Method in interface org.apache.beam.sdk.io.kafka.CheckStopReadingFn
- TimestampPolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
-
A timestamp policy to assign event time for messages in a Kafka partition and watermark for it.
- TimestampPolicy() - Constructor for class org.apache.beam.sdk.io.kafka.TimestampPolicy
- TimestampPolicy.PartitionContext - Class in org.apache.beam.sdk.io.kafka
-
The context contains state maintained in the reader for the partition.
- TimestampPolicyFactory<KeyT,ValueT> - Interface in org.apache.beam.sdk.io.kafka
-
An extendable factory used by the KafkaIO reader to create a TimestampPolicy for each partition at runtime.
- TimestampPolicyFactory.LogAppendTimePolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
-
Assigns Kafka's log append time (server side ingestion time) to each record.
- TimestampPolicyFactory.ProcessingTimePolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
-
A simple policy that uses current time for event time and watermark.
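The built-in policies listed here are typically supplied through KafkaIO.Read.withTimestampPolicyFactory(TimestampPolicyFactory). A minimal sketch, assuming the Beam Kafka module and Kafka clients are on the classpath; the broker address and topic name are hypothetical:

```java
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.io.kafka.TimestampPolicyFactory;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.joda.time.Duration;

public class TimestampPolicySketch {
  // Sketch only: attach a CREATE_TIME-based policy with a bounded watermark delay.
  static KafkaIO.Read<String, String> read() {
    return KafkaIO.<String, String>read()
        .withBootstrapServers("broker-1:9092") // hypothetical broker
        .withTopic("events")                   // hypothetical topic
        .withKeyDeserializer(StringDeserializer.class)
        .withValueDeserializer(StringDeserializer.class)
        // CREATE_TIME policy; one minute is an illustrative max delay.
        .withTimestampPolicyFactory(
            TimestampPolicyFactory.withCreateTime(Duration.standardMinutes(1)));
  }
}
```

Swapping in withLogAppendTime() or withProcessingTime() from the same factory interface follows the same pattern.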
- TimestampPolicyFactory.TimestampFnPolicy<K,V> - Class in org.apache.beam.sdk.io.kafka
-
Internal policy to support deprecated withTimestampFn API.
- TopicPartitionCoder - Class in org.apache.beam.sdk.io.kafka
-
The Coder for encoding and decoding TopicPartition in Beam.
- TopicPartitionCoder() - Constructor for class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
- toString() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark.PartitionMark
- toString() - Method in class org.apache.beam.sdk.io.kafka.KafkaCheckpointMark
- toString() - Method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
U
- updateConsumerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Deprecated. As of version 2.13, use KafkaIO.Read.withConsumerConfigUpdates(Map) instead.
- updateKafkaMetrics() - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
-
Export all metrics recorded in this instance to the underlying perWorkerMetrics containers.
- updateKafkaMetrics() - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.NoOpKafkaMetrics
- updateKafkaMetrics() - Method in interface org.apache.beam.sdk.io.kafka.KafkaMetrics
- updateProducerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Deprecated. As of version 2.13, use KafkaIO.Write.withProducerConfigUpdates(Map) instead.
- updateProducerProperties(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Deprecated. As of version 2.13, use KafkaIO.WriteRecords.withProducerConfigUpdates(Map) instead.
- updateSuccessfulRpcMetrics(String, Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.KafkaMetricsImpl
-
Record the RPC status and latency of a successful Kafka poll RPC call.
- updateSuccessfulRpcMetrics(String, Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaMetrics.NoOpKafkaMetrics
- updateSuccessfulRpcMetrics(String, Duration) - Method in interface org.apache.beam.sdk.io.kafka.KafkaMetrics
- URN - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Write.External
- URN_WITH_METADATA - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
- URN_WITHOUT_METADATA - Static variable in class org.apache.beam.sdk.io.kafka.KafkaIO.Read.External
V
- VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- VALID_START_OFFSET_VALUES - Static variable in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.kafka.KafkaReadSchemaTransformConfiguration
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
-
Returns the enum constant of this type with the specified name.
- values() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Writes just the values to Kafka.
- values() - Static method in enum org.apache.beam.sdk.io.kafka.KafkaTimestampType
-
Returns an array containing the constants of this enum type, in the order they are declared.
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.KafkaRecordCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.ProducerRecordCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.kafka.TopicPartitionCoder
W
- withAllowDuplicates() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withAllowDuplicates(Boolean) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Configure an ErrorHandler.BadRecordErrorHandler to receive records that fail to serialize when being sent to Kafka.
- withBadRecordErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets the bootstrap servers for the Kafka consumer.
- withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets the bootstrap servers to use for the Kafka consumer if unspecified via KafkaSourceDescriptor#getBootStrapServers().
- withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withBootstrapServers(String), used to keep compatibility with the old API based on the KV element type.
- withBootstrapServers(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Returns a new KafkaIO.Write transform with the Kafka producer pointing to bootstrapServers.
- withCheckStopReadingFn(CheckStopReadingFn) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
A custom CheckStopReadingFn that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
- withCheckStopReadingFn(SerializableFunction<TopicPartition, Boolean>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
A custom SerializableFunction that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
- withCheckStopReadingFn(CheckStopReadingFn) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
A custom CheckStopReadingFn that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
- withCheckStopReadingFn(SerializableFunction<TopicPartition, Boolean>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
A custom SerializableFunction that determines whether the ReadFromKafkaDoFn should stop reading from the given TopicPartition.
- withConsumerConfigOverrides(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Replaces the configuration for the main consumer.
- withConsumerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Update configuration for the backend main consumer.
- withConsumerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Updates configuration for the main consumer.
- withConsumerFactoryFn(SerializableFunction<Map<String, Object>, ? extends Consumer<?, ?>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withConsumerFactoryFn(SerializableFunction), used to keep compatibility with the old API based on the KV element type.
- withConsumerFactoryFn(SerializableFunction<Map<String, Object>, ? extends Consumer<?, ?>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
When exactly-once semantics are enabled (see KafkaIO.WriteRecords.withEOS(int, String)), the sink needs to fetch previously stored state from the Kafka topic.
- withConsumerFactoryFn(SerializableFunction<Map<String, Object>, Consumer<byte[], byte[]>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
A factory to create a Kafka Consumer from consumer configuration.
- withConsumerFactoryFn(SerializableFunction<Map<String, Object>, Consumer<byte[], byte[]>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
A factory to create a Kafka Consumer from consumer configuration.
- withConsumerPollingTimeout(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets the timeout, in seconds, for the Kafka consumer polling request in the ReadFromKafkaDoFn.
- withConsumerPollingTimeout(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets the timeout, in seconds, for the Kafka consumer polling request in the ReadFromKafkaDoFn.
- withCreateTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the creation time of the KafkaRecord as the output timestamp.
- withCreateTime(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets the timestamp policy based on the KafkaTimestampType.CREATE_TIME timestamp of the records.
- withCreateTime(Duration) - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
CustomTimestampPolicyWithLimitedDelay using KafkaTimestampType.CREATE_TIME from the record for the timestamp.
- withCreatWatermarkEstimatorFn(SerializableFunction<Instant, WatermarkEstimator<Instant>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
A function to create a WatermarkEstimator.
- withDynamicRead(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Configure the KafkaIO to use WatchForKafkaTopicPartitions to detect and emit any newly available TopicPartition for ReadFromKafkaDoFn to consume during pipeline execution.
- withElementTimestamp() - Static method in interface org.apache.beam.sdk.io.kafka.KafkaPublishTimestampFunction
-
Returns a KafkaPublishTimestampFunction that returns the element timestamp from the ProcessContext.
- withEOS(int, String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withEOS(int, String), used to keep compatibility with the old API based on the KV element type.
- withEOS(int, String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Provides exactly-once semantics while writing to Kafka, which enables applications with end-to-end exactly-once guarantees on top of exactly-once semantics within Beam pipelines.
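The withEOS entry above is the switch for exactly-once writes. A minimal sketch of how the Write methods compose, assuming the Beam Kafka module and Kafka clients are on the classpath; the broker addresses, topic name, shard count, and sink group id are all hypothetical:

```java
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.LongSerializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class ExactlyOnceWriteSketch {
  // Sketch only: attaches an exactly-once Kafka sink to an existing PCollection.
  static void attachSink(PCollection<KV<Long, String>> results) {
    results.apply(
        KafkaIO.<Long, String>write()
            .withBootstrapServers("broker-1:9092,broker-2:9092") // hypothetical brokers
            .withTopic("results")                                // hypothetical topic
            .withKeySerializer(LongSerializer.class)
            .withValueSerializer(StringSerializer.class)
            // 20 shards and a stable sink group id; both values are illustrative.
            .withEOS(20, "eos-sink-group"));
  }
}
```

The sink group id must stay stable across pipeline updates so the sink can recover its previously stored state.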
- withExtractOutputTimestampFn(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
A function to calculate the output timestamp for a given KafkaRecord.
- withGCPApplicationDefaultCredentials() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Creates and sets the Application Default Credentials for a Kafka consumer.
- withGCPApplicationDefaultCredentials() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Creates and sets the Application Default Credentials for a Kafka producer.
- withInputTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withInputTimestamp(), used to keep compatibility with the old API based on the KV element type.
- withInputTimestamp() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
The timestamp for each record being published is set to the timestamp of the element in the pipeline.
- withKeyDeserializer(Class<? extends Deserializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a Kafka Deserializer to interpret key bytes read from Kafka.
- withKeyDeserializer(Class<? extends Deserializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets a Kafka Deserializer to interpret key bytes read from Kafka.
- withKeyDeserializer(DeserializerProvider<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withKeyDeserializerAndCoder(Class<? extends Deserializer<K>>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a Kafka Deserializer for interpreting key bytes read from Kafka, along with a Coder for helping the Beam runner materialize key objects at runtime if necessary.
- withKeyDeserializerAndCoder(Class<? extends Deserializer<K>>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets a Kafka Deserializer for interpreting key bytes read from Kafka, along with a Coder for helping the Beam runner materialize key objects at runtime if necessary.
- withKeyDeserializerProvider(DeserializerProvider<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withKeyDeserializerProviderAndCoder(DeserializerProvider<K>, Coder<K>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withKeySerializer(Class<? extends Serializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withKeySerializer(Class), used to keep compatibility with the old API based on the KV element type.
- withKeySerializer(Class<? extends Serializer<K>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Sets a Serializer for serializing the key (if any) to bytes.
- withLogAppendTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withLogAppendTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the log append time as the output timestamp.
- withLogAppendTime() - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
A TimestampPolicy that assigns Kafka's log append time (server-side ingestion time) to each record.
- withManualWatermarkEstimator() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the WatermarkEstimators.Manual as the watermark estimator.
- withMaxNumRecords(long) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Similar to Read.Unbounded.withMaxNumRecords(long).
- withMaxReadTime(Duration) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Similar to Read.Unbounded.withMaxReadTime(Duration).
- withMonotonicallyIncreasingWatermarkEstimator() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the WatermarkEstimators.MonotonicallyIncreasing as the watermark estimator.
- withOffsetConsumerConfigOverrides(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Set additional configuration for the offset consumer.
- withOffsetConsumerConfigOverrides(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Set additional configuration for the backend offset consumer.
- withoutMetadata() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Returns a PTransform for a PCollection of KV, dropping Kafka metadata.
- withProcessingTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withProcessingTime() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the processing time as the output timestamp.
- withProcessingTime() - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
A TimestampPolicy that assigns processing time to each record.
- withProducerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Update configuration for the producer.
- withProducerConfigUpdates(Map<String, Object>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Update configuration for the producer.
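The two withProducerConfigUpdates entries above layer extra producer properties on top of KafkaIO's defaults. A hedged sketch, assuming the Beam Kafka module and Kafka clients are on the classpath; the broker address, topic, and chosen property values are hypothetical:

```java
import java.util.Map;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerConfigUpdateSketch {
  // Sketch only: pass additional producer properties as a Map<String, Object>.
  static KafkaIO.Write<String, String> write() {
    return KafkaIO.<String, String>write()
        .withBootstrapServers("broker-1:9092") // hypothetical broker
        .withTopic("events")                   // hypothetical topic
        .withKeySerializer(StringSerializer.class)
        .withValueSerializer(StringSerializer.class)
        .withProducerConfigUpdates(
            Map.<String, Object>of(
                ProducerConfig.COMPRESSION_TYPE_CONFIG, "gzip", // illustrative values
                ProducerConfig.LINGER_MS_CONFIG, "5"));
  }
}
```

The explicit `Map.<String, Object>of` keeps the map's value type compatible with the method's `Map<String, Object>` parameter.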
- withProducerFactoryFn(SerializableFunction<Map<String, Object>, Producer<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withProducerFactoryFn(SerializableFunction), used to keep compatibility with the old API based on the KV element type.
- withProducerFactoryFn(SerializableFunction<Map<String, Object>, Producer<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Sets a custom function to create Kafka producer.
- withPublishTimestampFunction(KafkaPublishTimestampFunction<KV<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Deprecated. Use KafkaIO.WriteRecords and ProducerRecords to set the publish timestamp.
- withPublishTimestampFunction(KafkaPublishTimestampFunction<ProducerRecord<K, V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Deprecated. Use ProducerRecords to set the publish timestamp.
- withReadCommitted() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets "isolation_level" to "read_committed" in Kafka consumer configuration.
- withReadCommitted() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets "isolation_level" to "read_committed" in Kafka consumer configuration.
- withRedistribute() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a redistribute transform that hints to the runner to try to redistribute the work evenly.
- withRedistribute() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Enable Redistribute.
- withRedistributeNumKeys(int) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withRedistributeNumKeys(int) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withStartReadTime(Instant) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Use a timestamp to set up the start offset.
- withStopReadTime(Instant) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Use a timestamp to set up the stop offset.
- withTimestampFn(SerializableFunction<KafkaRecord<K, V>, Instant>) - Static method in interface org.apache.beam.sdk.io.kafka.TimestampPolicyFactory
-
Deprecated.
- withTimestampFn(SerializableFunction<KV<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Deprecated. As of version 2.4, use KafkaIO.Read.withTimestampPolicyFactory(TimestampPolicyFactory) instead.
- withTimestampFn2(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Deprecated. As of version 2.4, use KafkaIO.Read.withTimestampPolicyFactory(TimestampPolicyFactory) instead.
- withTimestampPolicyFactory(TimestampPolicyFactory<K, V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Provide a custom TimestampPolicyFactory to set event times and the watermark for each partition.
- withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets the topic to read from.
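A minimal sketch of how the Read methods in this index compose, assuming the Beam Kafka module and Kafka clients are on the classpath; the broker address and topic name are hypothetical:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ReadSketch {
  // Sketch only: an unbounded read yielding KV pairs without Kafka metadata.
  static PCollection<KV<Long, String>> read(Pipeline p) {
    return p.apply(
        KafkaIO.<Long, String>read()
            .withBootstrapServers("broker-1:9092") // hypothetical broker
            .withTopic("events")                   // hypothetical topic
            .withKeyDeserializer(LongDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withoutMetadata()); // drop KafkaRecord metadata, yielding KV<K, V>
  }
}
```

Omitting withoutMetadata() yields a PCollection of KafkaRecord instead, keeping topic, partition, offset, and timestamp metadata.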
- withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withTopic(String), used to keep compatibility with the old API based on the KV element type.
- withTopic(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Sets the default Kafka topic to write to.
- withTopicPartitions(List<TopicPartition>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a list of partitions to read from.
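Reading can be pinned to explicit partitions instead of whole topics. A hedged sketch, assuming the Beam Kafka module and Kafka clients are on the classpath; the broker, topic, and partition numbers are hypothetical:

```java
import java.util.Arrays;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PartitionReadSketch {
  // Sketch only: read just partitions 0 and 1 of a topic.
  static KafkaIO.Read<String, String> read() {
    return KafkaIO.<String, String>read()
        .withBootstrapServers("broker-1:9092") // hypothetical broker
        .withTopicPartitions(
            Arrays.asList(
                new TopicPartition("events", 0),   // hypothetical topic/partitions
                new TopicPartition("events", 1)))
        .withKeyDeserializer(StringDeserializer.class)
        .withValueDeserializer(StringDeserializer.class);
  }
}
```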
- withTopicPattern(String) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Internally sets a Pattern of topics to read from.
- withTopics(List<String>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a list of topics to read from.
- withValueDeserializer(Class<? extends Deserializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a Kafka Deserializer to interpret value bytes read from Kafka.
- withValueDeserializer(Class<? extends Deserializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets a Kafka Deserializer to interpret value bytes read from Kafka.
- withValueDeserializer(DeserializerProvider<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withValueDeserializerAndCoder(Class<? extends Deserializer<V>>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Sets a Kafka Deserializer for interpreting value bytes read from Kafka, along with a Coder for helping the Beam runner materialize value objects at runtime if necessary.
- withValueDeserializerAndCoder(Class<? extends Deserializer<V>>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Sets a Kafka Deserializer for interpreting value bytes read from Kafka, along with a Coder for helping the Beam runner materialize value objects at runtime if necessary.
- withValueDeserializerProvider(DeserializerProvider<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
- withValueDeserializerProviderAndCoder(DeserializerProvider<V>, Coder<V>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
- withValueSerializer(Class<? extends Serializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Write
-
Wrapper method over KafkaIO.WriteRecords.withValueSerializer(Class), used to keep compatibility with the old API based on the KV element type.
- withValueSerializer(Class<? extends Serializer<V>>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
-
Sets a Serializer for serializing the value to bytes.
- withWallTimeWatermarkEstimator() - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.ReadSourceDescriptors
-
Use the WatermarkEstimators.WallTime as the watermark estimator.
- withWatermarkFn(SerializableFunction<KV<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Deprecated. As of version 2.4, use KafkaIO.Read.withTimestampPolicyFactory(TimestampPolicyFactory) instead.
- withWatermarkFn2(SerializableFunction<KafkaRecord<K, V>, Instant>) - Method in class org.apache.beam.sdk.io.kafka.KafkaIO.Read
-
Deprecated. As of version 2.4, use KafkaIO.Read.withTimestampPolicyFactory(TimestampPolicyFactory) instead.
- write() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
-
Creates an uninitialized KafkaIO.Write PTransform.
- Write() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.Write
- writeRecords() - Static method in class org.apache.beam.sdk.io.kafka.KafkaIO
-
Creates an uninitialized KafkaIO.WriteRecords PTransform.
- WriteRecords() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaIO.WriteRecords
- WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.kafka.KafkaSchemaTransformTranslation.WriteRegistrar