package kafka011
Type Members
-
case class
AssignStrategy(partitions: Array[TopicPartition]) extends ConsumerStrategy with Product with Serializable
Specify a fixed collection of partitions.
-
sealed
trait
ConsumerStrategy extends AnyRef
Subscribe allows you to subscribe to a fixed collection of topics. SubscribePattern allows you to use a regex to specify topics of interest. Note that, unlike the 0.8 integration, Subscribe and SubscribePattern respond to partitions being added while a stream is running. Finally, Assign allows you to specify a fixed collection of partitions. All three strategies have overloaded constructors that let you specify the starting offset for a particular partition.
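The three strategies form a small sealed hierarchy. The sketch below mirrors the case classes documented on this page as a self-contained example; `TopicPartition` here is a simplified stand-in for `org.apache.kafka.common.TopicPartition`, and `describe` is a hypothetical helper, not part of the package:

```scala
// Stand-in for org.apache.kafka.common.TopicPartition (assumption, not the real class).
case class TopicPartition(topic: String, partition: Int)

// Sealed hierarchy mirroring the ConsumerStrategy variants documented above.
sealed trait ConsumerStrategy
case class AssignStrategy(partitions: Array[TopicPartition]) extends ConsumerStrategy
case class SubscribeStrategy(topics: Seq[String]) extends ConsumerStrategy
case class SubscribePatternStrategy(topicPattern: String) extends ConsumerStrategy

// Because the trait is sealed, a pattern match covers every strategy exhaustively.
def describe(strategy: ConsumerStrategy): String = strategy match {
  case AssignStrategy(ps)           => s"assign ${ps.length} partition(s)"
  case SubscribeStrategy(topics)    => s"subscribe to ${topics.mkString(", ")}"
  case SubscribePatternStrategy(re) => s"subscribe to topics matching $re"
}
```

Sealing the trait is what lets the compiler warn when a match misses a strategy, which is why new strategies must live in this package.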
-
case class
KafkaContinuousInputPartition(topicPartition: TopicPartition, startOffset: Long, kafkaParams: Map[String, AnyRef], pollTimeoutMs: Long, failOnDataLoss: Boolean) extends ContinuousInputPartition[InternalRow] with Product with Serializable
An input partition for continuous Kafka processing. This will be serialized and transformed into a full reader on executors.
- topicPartition
The (topic, partition) pair this task is responsible for.
- startOffset
The offset to start reading from within the partition.
- kafkaParams
Kafka consumer params to use.
- pollTimeoutMs
The timeout for Kafka consumer polling.
- failOnDataLoss
Flag indicating whether the data reader should fail if some offsets are skipped.
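The serialize-then-instantiate pattern the description refers to can be sketched as follows; all names here are simplified stand-ins for the real Spark classes, not their actual signatures:

```scala
// Stand-in for org.apache.kafka.common.TopicPartition (assumption).
case class TopicPartition(topic: String, partition: Int)

// A small, serializable description of the work, shipped from driver to executor.
case class ContinuousInputPartition(
    topicPartition: TopicPartition,
    startOffset: Long,
    pollTimeoutMs: Long,
    failOnDataLoss: Boolean) extends Serializable {
  // On the executor, the lightweight description becomes a full per-task reader.
  def createReader(): PartitionReader = new PartitionReader(topicPartition, startOffset)
}

// Hypothetical reader that walks offsets from the given starting point.
class PartitionReader(tp: TopicPartition, start: Long) {
  private var offset = start
  def next(): Long = { val o = offset; offset += 1; o }
}

val part = ContinuousInputPartition(TopicPartition("events", 0), startOffset = 10L,
  pollTimeoutMs = 512L, failOnDataLoss = true)
val reader = part.createReader()
```

Keeping the partition description small and Serializable is what makes it cheap to ship to executors; the heavyweight consumer state lives only in the reader created there.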
-
class
KafkaContinuousInputPartitionReader extends ContinuousInputPartitionReader[InternalRow]
A per-task data reader for continuous Kafka processing.
-
class
KafkaContinuousReader extends ContinuousReader with Logging
A ContinuousReader for data from Kafka.
-
class
KafkaStreamDataWriter extends KafkaRowWriter with DataWriter[InternalRow]
A DataWriter for Kafka writing. One data writer is created in each partition to process incoming rows.
-
class
KafkaStreamWriter extends StreamWriter
A StreamWriter for Kafka writing, responsible for generating the writer factory.
-
case class
KafkaStreamWriterFactory(topic: Option[String], producerParams: Map[String, AnyRef], schema: StructType) extends DataWriterFactory[InternalRow] with Product with Serializable
A DataWriterFactory for Kafka writing. It will be serialized and sent to executors to generate the per-task data writers.
- topic
The topic that should be written to. If None, the topic will be inferred from a topic field in the incoming data.
- producerParams
Parameters for Kafka producers in each task.
- schema
The schema of the input data.
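The factory-to-writer handoff can be sketched as a minimal, self-contained example; `StreamWriterFactory` and its members below are hypothetical stand-ins illustrating the pattern, not the real Spark API:

```scala
// Hypothetical sketch: the factory is Serializable so it can be shipped to executors,
// where each task calls createWriter for its own partition.
trait DataWriter {
  def write(row: String): Unit
  def rows: Seq[String]
}

case class StreamWriterFactory(topic: Option[String]) extends Serializable {
  def createWriter(partitionId: Int): DataWriter = new DataWriter {
    private var buf = Vector.empty[String]
    // Tag each row with the (possibly inferred) topic and the owning partition.
    def write(row: String): Unit = buf :+= s"[${topic.getOrElse("inferred")}:$partitionId] $row"
    def rows: Seq[String] = buf
  }
}

val writer = StreamWriterFactory(Some("out")).createWriter(partitionId = 3)
writer.write("row-1")
```

Only the factory crosses the serialization boundary; the writers themselves, with their mutable buffers and producer handles, are created locally on each executor.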
- type PartitionOffsetMap = Map[TopicPartition, Long]
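A hedged sketch of how such a map is typically built and queried; `TopicPartition` is again a simplified stand-in, and the `0L` default for missing partitions is an assumption for illustration:

```scala
// Stand-in for org.apache.kafka.common.TopicPartition (assumption).
case class TopicPartition(topic: String, partition: Int)

// The alias documented above: one offset per topic-partition.
type PartitionOffsetMap = Map[TopicPartition, Long]

val offsets: PartitionOffsetMap = Map(
  TopicPartition("events", 0) -> 42L,
  TopicPartition("events", 1) -> 7L
)
// A partition absent from the map falls back to a default offset (0L here, an assumption).
val next = offsets.getOrElse(TopicPartition("events", 2), 0L)
```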
-
case class
SubscribePatternStrategy(topicPattern: String) extends ConsumerStrategy with Product with Serializable
Use a regex to specify topics of interest.
-
case class
SubscribeStrategy(topics: Seq[String]) extends ConsumerStrategy with Product with Serializable
Subscribe to a fixed collection of topics.
Value Members
-
object
KafkaWriterCommitMessage extends WriterCommitMessage with Product with Serializable
Dummy commit message. The DataSourceV2 framework requires a commit message implementation, but we don't actually need to send one.