Class TestDStream<T>

java.lang.Object
  org.apache.spark.streaming.dstream.DStream<T>
    org.apache.spark.streaming.dstream.InputDStream<org.apache.beam.sdk.util.WindowedValue<T>>
      org.apache.beam.runners.spark.translation.streaming.TestDStream<T>

- All Implemented Interfaces:
java.io.Serializable, org.apache.spark.internal.Logging, scala.Serializable

public class TestDStream<T>
extends org.apache.spark.streaming.dstream.InputDStream<org.apache.beam.sdk.util.WindowedValue<T>>

- See Also:
Serialized Form
Constructor Summary

- TestDStream(org.apache.beam.sdk.testing.TestStream<T> test, org.apache.spark.streaming.StreamingContext ssc)
Method Summary
All Methods, Instance Methods, Concrete Methods

- scala.Option<org.apache.spark.rdd.RDD<org.apache.beam.sdk.util.WindowedValue<T>>> compute(org.apache.spark.streaming.Time validTime)
- void start()
- void stop()
Methods inherited from class org.apache.spark.streaming.dstream.InputDStream
baseScope, dependencies, id, isTimeValid, lastValidTime, lastValidTime_$eq, name, rateController, slideDuration
-
Methods inherited from class org.apache.spark.streaming.dstream.DStream
cache, checkpoint, checkpointData, checkpointDuration, checkpointDuration_$eq, clearCheckpointData, clearMetadata, context, count, countByValue, countByValue$default$1, countByValue$default$2, countByValueAndWindow, countByValueAndWindow$default$3, countByValueAndWindow$default$4, countByWindow, createRDDWithLocalProperties, creationSite, filter, flatMap, foreachRDD, foreachRDD, generatedRDDs, generatedRDDs_$eq, generateJob, getOrCompute, glom, graph, graph_$eq, initialize, initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isInitialized, isTraceEnabled, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning, map, mapPartitions, mapPartitions$default$2, mustCheckpoint, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, parentRememberDuration, persist, persist, print, print, reduce, reduceByWindow, reduceByWindow, register, remember, rememberDuration, rememberDuration_$eq, repartition, restoreCheckpointData, saveAsObjectFiles, saveAsObjectFiles$default$2, saveAsTextFiles, saveAsTextFiles$default$2, setContext, setGraph, slice, slice, ssc, ssc_$eq, storageLevel, storageLevel_$eq, toPairDStreamFunctions, toPairDStreamFunctions$default$4, transform, transform, transformWith, transformWith, union, updateCheckpointData, validateAtStart, window, window, zeroTime, zeroTime_$eq
-
Constructor Detail

TestDStream
public TestDStream(org.apache.beam.sdk.testing.TestStream<T> test,
                   org.apache.spark.streaming.StreamingContext ssc)
-
-
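A minimal sketch of how this class might be wired up, assuming Beam and Spark Streaming are on the classpath. The coder choice, batch interval, and app name below are illustrative assumptions, not part of this API; only the `TestDStream(TestStream<T>, StreamingContext)` constructor and the inherited `register()` come from the signatures on this page.

```java
import org.apache.beam.runners.spark.translation.streaming.TestDStream;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.testing.TestStream;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.StreamingContext;

public class TestDStreamSketch {
  public static void main(String[] args) {
    // A TestStream of Strings with a deterministic sequence of events
    // (coder and elements are illustrative).
    TestStream<String> testStream =
        TestStream.create(StringUtf8Coder.of())
            .addElements("a", "b")
            .advanceWatermarkToInfinity();

    // Local StreamingContext with an illustrative 500 ms batch interval.
    SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("test-dstream-sketch");
    StreamingContext ssc = new StreamingContext(conf, new Duration(500));

    // TestDStream replays the TestStream's events as Spark micro-batches;
    // each batch's compute(validTime) yields Option<RDD<WindowedValue<String>>>.
    TestDStream<String> dstream = new TestDStream<>(testStream, ssc);
    dstream.register(); // inherited from DStream: adds this stream to the graph
  }
}
```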
Method Detail
-
compute
public scala.Option<org.apache.spark.rdd.RDD<org.apache.beam.sdk.util.WindowedValue<T>>> compute(org.apache.spark.streaming.Time validTime)
- Specified by:
compute in class org.apache.spark.streaming.dstream.DStream<org.apache.beam.sdk.util.WindowedValue<T>>
-
start
public void start()
- Specified by:
start in class org.apache.spark.streaming.dstream.InputDStream<org.apache.beam.sdk.util.WindowedValue<T>>
-
stop
public void stop()
- Specified by:
stop in class org.apache.spark.streaming.dstream.InputDStream<org.apache.beam.sdk.util.WindowedValue<T>>