package flink
- Alphabetic
- Public
- All
Type Members
-
class
AsyncKVStoreWriter extends RichAsyncFunction[PutRequest, WriteResponse]
Async Flink writer function to help us write to the KV store.
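The async-write pattern this class implements can be sketched in plain Scala. This is a hypothetical simplification: the `PutRequest`/`WriteResponse` shapes and the `store` callback stand in for the real KV store client, and the real class runs inside Flink's `RichAsyncFunction` lifecycle rather than via `Await`.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Hypothetical stand-ins for the real PutRequest and KV store client.
case class PutRequest(keyBytes: Array[Byte], valueBytes: Array[Byte])
case class WriteResponse(putRequest: PutRequest, status: Boolean)

// The core async-write pattern: issue the KV write off the caller's thread
// and complete with a WriteResponse capturing success or failure.
def asyncWrite(put: PutRequest, store: PutRequest => Boolean): Future[WriteResponse] =
  Future(store(put))
    .map(ok => WriteResponse(put, status = ok))
    .recover { case _ => WriteResponse(put, status = false) }

val ok = Await.result(
  asyncWrite(PutRequest("k".getBytes, "v".getBytes), _ => true), 1.second)
val failed = Await.result(
  asyncWrite(PutRequest("k".getBytes, "v".getBytes), _ => sys.error("kv down")), 1.second)
```

Wrapping failures into a `status = false` response (instead of failing the future) lets the downstream operator observe and count write errors without killing the job.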
-
case class
AvroCodecFn[T](groupByServingInfoParsed: GroupByServingInfoParsed) extends RichFlatMapFunction[Map[String, Any], PutRequest] with Product with Serializable
A Flink function responsible for converting the Spark expr eval output into a form that can be written out to the KV store (a PutRequest object).
- T
The input data type
- groupByServingInfoParsed
The GroupBy we are working with
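The conversion step can be illustrated with a toy version of the flatMap body. This is a sketch under assumptions: `toPutRequest`, the `keyColumns` parameter, and the comma-joined encoding are illustrative — the real class Avro-encodes keys and values against the schemas in `GroupByServingInfoParsed`.

```scala
// Hypothetical stand-in for the real PutRequest.
case class PutRequest(keyBytes: Array[Byte], valueBytes: Array[Byte])

// Toy conversion: pull the key columns out of the Spark expr eval output
// and serialize the remaining fields as the value payload. The real
// AvroCodecFn uses Avro encoding driven by the GroupBy's schemas.
def toPutRequest(row: Map[String, Any], keyColumns: Seq[String]): PutRequest = {
  val key   = keyColumns.map(c => row(c).toString).mkString(",")
  val value = (row -- keyColumns).map { case (k, v) => s"$k=$v" }.mkString(",")
  PutRequest(key.getBytes("UTF-8"), value.getBytes("UTF-8"))
}

val pr = toPutRequest(Map("user_id" -> 123, "amount" -> 10.5), Seq("user_id"))
```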
-
class
FlinkJob[T] extends AnyRef
Flink job that processes a single streaming GroupBy and writes out the results (raw events in the untiled case, pre-aggregates in the tiled case) to the KV store.
Flink job that processes a single streaming GroupBy and writes out the results (raw events in the untiled case, pre-aggregates in the tiled case) to the KV store. At a high level, the operators are structured as follows: Kafka source -> Spark expression eval -> Avro conversion -> KV store writer
- Kafka source - Reads objects of type T (a specific case class, Thrift / Proto) from a Kafka topic
- Spark expression eval - Evaluates the Spark SQL expressions in the GroupBy, projecting and filtering the input data
- Avro conversion - Converts the Spark expr eval output to a form that can be written out to the KV store (a PutRequest object)
- KV store writer - Writes the PutRequest objects to the KV store using the AsyncDataStream API
In the untiled version there are no shuffles, so the job ends up as a single node in the Flink DAG (containing the above four operators, with parallelism as configured by the user).
- T
The input data type
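The four-operator pipeline described above can be mimicked as plain function composition over a list. All names and types here are illustrative stand-ins (not the real Chronon API), and the KV store writer stage is simulated in-memory; in the real job it is an async write via the AsyncDataStream API.

```scala
// Hypothetical stand-ins for the real types.
case class PutRequest(keyBytes: Array[Byte], valueBytes: Array[Byte])
case class Event(userId: String, amount: Double) // the "T" read from Kafka

// Kafka source: in the real job, a Kafka consumer emitting T.
val source: List[Event] = List(Event("u1", 12.0), Event("u2", -1.0))

// Spark expression eval: project and filter (here, a toy WHERE amount > 0).
val evaluated: List[Map[String, Any]] =
  source.filter(_.amount > 0)
        .map(e => Map("user_id" -> e.userId, "amount" -> e.amount))

// Avro conversion: Map -> PutRequest (toy string encoding, not Avro).
val putRequests: List[PutRequest] =
  evaluated.map(m =>
    PutRequest(m("user_id").toString.getBytes, m("amount").toString.getBytes))

// KV store writer: simulated here; really AsyncDataStream + AsyncKVStoreWriter.
val written: List[(String, String)] =
  putRequests.map(p => (new String(p.keyBytes), new String(p.valueBytes)))
```

Because each stage is a per-record map or filter with no keyed shuffle, Flink can chain them into the single DAG node mentioned above.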
- abstract class FlinkSource[T] extends Serializable
-
class
SparkExpressionEvalFn[T] extends RichFlatMapFunction[T, Map[String, Any]]
A Flink function that uses Chronon's CatalystUtil to evaluate the Spark SQL expression in a GroupBy.
A Flink function that uses Chronon's CatalystUtil to evaluate the Spark SQL expression in a GroupBy. This function is instantiated for a given type T (a specific case class, Thrift / Proto object). Based on the selects and where clauses in the GroupBy, this function projects and filters the input data and emits a Map containing the relevant fields and values needed to compute the aggregated values for the GroupBy.
- T
The type of the input data.
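The project-and-filter behavior can be modeled per record as follows. This is a toy model: `Purchase`, the `selects` map of Scala functions, and the `where` predicate are illustrative stand-ins for the real Spark SQL expressions evaluated via CatalystUtil.

```scala
// Hypothetical input type T.
case class Purchase(userId: String, priceCents: Long, refunded: Boolean)

// Toy "select" expressions: column name -> expression over the input record.
val selects: Map[String, Purchase => Any] = Map(
  "user_id"       -> (_.userId),
  "price_dollars" -> (p => p.priceCents / 100.0))

// Toy "where" clause.
val where: Purchase => Boolean = (p: Purchase) => !p.refunded

// Per-record eval: filter on the where clause, then project the selects
// into the Map[String, Any] that downstream operators consume.
def eval(p: Purchase): Option[Map[String, Any]] =
  if (where(p)) Some(selects.map { case (name, expr) => name -> expr(p) })
  else None
```

Records failing the where clause emit nothing, which is why the real class is a `RichFlatMapFunction` (zero-or-one output per input) rather than a plain map.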
- case class WriteResponse(putRequest: PutRequest, status: Boolean) extends Product with Serializable
Value Members
- object AsyncKVStoreWriter extends Serializable