
class ContinuousMemoryStream[A] extends MemoryStreamBase[A] with ContinuousStream

The overall strategy here is:

  * ContinuousMemoryStream maintains a list of records for each partition. addData() will distribute records evenly-ish across partitions.
  * RecordEndpoint is set up as an endpoint for executor-side ContinuousMemoryStreamInputPartitionReader instances to poll. It returns the record at the specified offset within the list, or null if that offset doesn't yet have a record.
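The flow above can be sketched as a small test-style driver: add records on the driver, then run a continuous-trigger query so executor-side readers poll them back. The constructor, addData, and toDF calls follow the signatures on this page; the SparkSession setup and query options are illustrative assumptions, not part of this class.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.streaming.sources.ContinuousMemoryStream
import org.apache.spark.sql.streaming.Trigger

// Illustrative setup: a local SparkSession (assumption, not part of this API)
val spark = SparkSession.builder().master("local[2]").getOrCreate()
import spark.implicits._  // provides the implicit Encoder[Int]

// id = 0; numPartitions defaults to 2, per the constructor on this page
val stream = new ContinuousMemoryStream[Int](0, spark.sqlContext)

// addData distributes these records evenly-ish across the partitions
stream.addData(1, 2, 3, 4)

// A continuous-trigger query; executor-side partition readers poll the
// driver's RecordEndpoint for records at increasing offsets
val query = stream.toDF()
  .writeStream
  .format("console")
  .trigger(Trigger.Continuous("1 second"))
  .start()
```

Because the records live on the driver, the executor-side readers never exhaust a partition: when a reader asks for an offset with no record yet, the endpoint answers null and the reader retries.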

Linear Supertypes
ContinuousStream, MemoryStreamBase[A], SparkDataStream, AnyRef, Any

Instance Constructors

  1. new ContinuousMemoryStream(id: Int, sqlContext: SQLContext, numPartitions: Int = 2)(implicit arg0: Encoder[A])

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def addData(data: TraversableOnce[A]): connector.read.streaming.Offset
  5. def addData(data: A*): connector.read.streaming.Offset
    Definition Classes
    MemoryStreamBase
  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. val attributes: Seq[AttributeReference]
    Attributes
    protected
    Definition Classes
    MemoryStreamBase
  8. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  9. def commit(end: connector.read.streaming.Offset): Unit
    Definition Classes
    ContinuousMemoryStream → MemoryStreamBase → SparkDataStream
  10. def createContinuousReaderFactory(): ContinuousPartitionReaderFactory
    Definition Classes
    ContinuousMemoryStream → ContinuousStream
  11. def deserializeOffset(json: String): ContinuousMemoryStreamOffset
    Definition Classes
    ContinuousMemoryStream → MemoryStreamBase → SparkDataStream
  12. val encoder: ExpressionEncoder[A]
    Definition Classes
    MemoryStreamBase
  13. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  14. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  15. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  16. def fullSchema(): StructType
    Definition Classes
    MemoryStreamBase
  17. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  18. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  19. def initialOffset(): connector.read.streaming.Offset
    Definition Classes
    ContinuousMemoryStream → MemoryStreamBase → SparkDataStream
  20. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  21. val logicalPlan: LogicalPlan
    Attributes
    protected
    Definition Classes
    MemoryStreamBase
  22. def mergeOffsets(offsets: Array[PartitionOffset]): ContinuousMemoryStreamOffset
    Definition Classes
    ContinuousMemoryStream → ContinuousStream
  23. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  24. def needsReconfiguration(): Boolean
    Definition Classes
    ContinuousStream
  25. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  26. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  27. def planInputPartitions(start: connector.read.streaming.Offset): Array[InputPartition]
    Definition Classes
    ContinuousMemoryStream → ContinuousStream
  28. def stop(): Unit
    Definition Classes
    ContinuousMemoryStream → SparkDataStream
  29. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  30. def toDF(): DataFrame
    Definition Classes
    MemoryStreamBase
  31. def toDS(): Dataset[A]
    Definition Classes
    MemoryStreamBase
  32. lazy val toRow: Serializer[A]
    Attributes
    protected
    Definition Classes
    MemoryStreamBase
  33. def toString(): String
    Definition Classes
    AnyRef → Any
  34. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  35. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  36. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
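The "evenly-ish" distribution that addData performs over the per-partition record lists can be pictured as a simple round-robin. This is a self-contained sketch of the idea, not Spark's actual implementation:

```scala
import scala.collection.mutable.ListBuffer

// One record buffer per partition (numPartitions defaults to 2 on this class)
val numPartitions = 2
val records = Seq.fill(numPartitions)(ListBuffer.empty[Int])

// Round-robin sketch of addData: record i lands in partition i % numPartitions
def addData(data: TraversableOnce[Int]): Unit =
  data.toSeq.zipWithIndex.foreach { case (item, i) =>
    records(i % numPartitions) += item
  }

addData(Seq(1, 2, 3, 4, 5))
// records(0) holds 1, 3, 5; records(1) holds 2, 4
```

Executor-side readers then consume each partition's list by index, which is why offsets in this source are just per-partition record counts.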

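Conceptually, mergeOffsets folds the per-partition offsets reported by the readers into one global offset: a map from partition id to the number of records processed in that partition. The case classes below are hand-rolled stand-ins for Spark's offset types, used only to illustrate the shape of the merge:

```scala
// Hypothetical stand-ins for the real PartitionOffset /
// ContinuousMemoryStreamOffset types (assumptions, not Spark's classes)
case class PartitionOffset(partition: Int, numProcessed: Int)
case class MemoryStreamOffset(partitionNums: Map[Int, Int])

// Merge: each partition reports how far it has read; the global offset
// is simply the collection of those per-partition positions
def mergeOffsets(offsets: Array[PartitionOffset]): MemoryStreamOffset =
  MemoryStreamOffset(offsets.map(o => o.partition -> o.numProcessed).toMap)

val merged = mergeOffsets(Array(PartitionOffset(0, 3), PartitionOffset(1, 2)))
// merged.partitionNums == Map(0 -> 3, 1 -> 2)
```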