Package za.co.absa.spark

package hofs

The package object provides a Scala wrapper for all of Spark's higher-order functions.

Linear Supertypes
AnyRef, Any

Value Members

  1. object Extensions

    A container of extension methods used by the wrapper of higher-order functions.

  2. def aggregate(array: Column, zero: Column, merge: (Column, Column) ⇒ Column, finish: (Column) ⇒ Column, accumulatorName: String, elementName: String): Column

    Applies the binary merge function to gradually reduce the zero element and all elements of the input array into a single element. The function is equivalent to foldLeft from functional programming. After reducing the array to a single element, the function passes that element to the finish function to convert it into a suitable form.

    array: A column of input arrays
    zero: A column of zero (initial accumulator) elements
    merge: A function that takes an accumulator and an array element and returns the accumulator for the next iteration
    finish: A function converting the reduced element into a suitable form
    accumulatorName: The name of the lambda variable representing the accumulator
    elementName: The name of the lambda variable representing an array element
    returns: A column of single elements
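
The foldLeft semantics described above can be illustrated with a plain-Scala model (this sketch uses ordinary collections rather than Spark Columns; in Spark the same reduction runs per row over each array; `aggregateModel` is a hypothetical name for illustration):

```scala
// Plain-Scala model of aggregate: fold the array with `merge`,
// starting from `zero`, then post-process the result with `finish`.
def aggregateModel[A, B, C](array: Seq[A], zero: B)(merge: (B, A) => B)(finish: B => C): C =
  finish(array.foldLeft(zero)(merge))

// Compute an average: accumulate (sum, count), finish by dividing.
val avg = aggregateModel(Seq(1.0, 2.0, 3.0, 4.0), (0.0, 0))(
  (acc, x) => (acc._1 + x, acc._2 + 1)
)(acc => acc._1 / acc._2)
// avg == 2.5
```

The finish step is what distinguishes this overload from the plain fold: it lets the accumulator have a different shape (here a (sum, count) pair) than the final result.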

  3. def aggregate(array: Column, zero: Column, merge: (Column, Column) ⇒ Column, finish: (Column) ⇒ Column): Column

    Applies the binary merge function to gradually reduce the zero element and all elements of the input array into a single element. The function is equivalent to foldLeft from functional programming. After reducing the array to a single element, the function passes that element to the finish function to convert it into a suitable form. The lambda variable for the accumulator will appear as acc and the lambda variable for the element as elm in Spark execution plans.

    array: A column of input arrays
    zero: A column of zero (initial accumulator) elements
    merge: A function that takes an accumulator and an array element and returns the accumulator for the next iteration
    finish: A function converting the reduced element into a suitable form
    returns: A column of single elements

  4. def aggregate(array: Column, zero: Column, merge: (Column, Column) ⇒ Column, accumulatorName: String, elementName: String): Column

    Applies the binary merge function to gradually reduce the zero element and all elements of the input array into a single element. The function is equivalent to foldLeft from functional programming.

    array: A column of input arrays
    zero: A column of zero (initial accumulator) elements
    merge: A function that takes an accumulator and an array element and returns the accumulator for the next iteration
    accumulatorName: The name of the lambda variable representing the accumulator
    elementName: The name of the lambda variable representing an array element
    returns: A column of single elements

  5. def aggregate(array: Column, zero: Column, merge: (Column, Column) ⇒ Column): Column

    Applies the binary merge function to gradually reduce the zero element and all elements of the input array into a single element. The function is equivalent to foldLeft from functional programming. The lambda variable for the accumulator will appear as acc and the lambda variable for the element as elm in Spark execution plans.

    array: A column of input arrays
    zero: A column of zero (initial accumulator) elements
    merge: A function that takes an accumulator and an array element and returns the accumulator for the next iteration
    returns: A column of single elements

  6. def filter(array: Column, f: (Column) ⇒ Column, elementName: String): Column

    Filters out elements that do not satisfy the predicate f from the input array.

    array: An input column of arrays
    f: A function representing the predicate
    elementName: The name of the lambda variable representing an array element within the predicate logic
    returns: A column of filtered arrays (all elements within each array satisfy the predicate)

  7. def filter(array: Column, f: (Column) ⇒ Column): Column

    Filters out elements that do not satisfy the predicate f from the input array. The lambda variable within the predicate will appear as elm in Spark execution plans.

    array: An input column of arrays
    f: A function representing the predicate
    returns: A column of filtered arrays (all elements within each array satisfy the predicate)
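
A plain-Scala model of the per-row behavior: the column holds one array per row, and the predicate is applied inside each array independently (the names `rows` and `filtered` are illustrative only):

```scala
// Each element of `rows` models one row's array in the column;
// filter keeps only the elements satisfying the predicate.
val rows = Seq(Seq(1, 2, 3), Seq(4, 5), Seq.empty[Int])
val filtered = rows.map(_.filter(_ % 2 == 0))
// filtered == Seq(Seq(2), Seq(4), Seq())
```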

  8. def transform(array: Column, f: (Column, Column) ⇒ Column, elementName: String, indexName: String): Column

    Applies the function f to every element in the array. The function f also receives the index of the current element while iterating over the array; the index starts from 0.

    array: A column of arrays
    f: A function transforming individual elements of the array
    elementName: The name of the lambda variable representing an array element
    indexName: The name of the lambda variable representing the index that changes with each iteration
    returns: A column of arrays with transformed elements

  9. def transform(array: Column, f: (Column, Column) ⇒ Column): Column

    Applies the function f to every element in the array. The function f also receives the index of the current element while iterating over the array; the index starts from 0. The lambda variable for the element will appear as elm and the index as idx in Spark execution plans.

    array: A column of arrays
    f: A function transforming individual elements of the array
    returns: A column of arrays with transformed elements
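
The indexed variant can be modeled in plain Scala with zipWithIndex (a sketch; `transformWithIndex` is a hypothetical name, not part of the wrapper):

```scala
// Plain-Scala model of the indexed transform: f receives each element
// together with its 0-based position in the array.
def transformWithIndex[A, B](array: Seq[A])(f: (A, Int) => B): Seq[B] =
  array.zipWithIndex.map { case (e, i) => f(e, i) }

// Scale each element by its position.
val scaled = transformWithIndex(Seq(10, 20, 30))((e, i) => e * i)
// scaled == Seq(0, 20, 60)
```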

  10. def transform(array: Column, f: (Column) ⇒ Column, elementName: String): Column

    Applies the function f to every element in the array. The method is equivalent to the map function from functional programming.

    array: A column of arrays
    f: A function transforming individual elements of the array
    elementName: The name of the lambda variable; the value appears in Spark execution plans.
    returns: A column of arrays with transformed elements

  11. def transform(array: Column, f: (Column) ⇒ Column): Column

    Applies the function f to every element in the array. The method is equivalent to the map function from functional programming. The lambda variable will appear as elm in Spark execution plans.

    array: A column of arrays
    f: A function transforming individual elements of the array
    returns: A column of arrays with transformed elements
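
The map equivalence mentioned above, as a plain-Scala sketch (`transformModel` is an illustrative name; in Spark the function operates on Column expressions per row):

```scala
// Plain-Scala model of transform: apply f to every element, i.e. map.
def transformModel[A, B](array: Seq[A])(f: A => B): Seq[B] = array.map(f)

val doubled = transformModel(Seq(1, 2, 3))(_ * 2)
// doubled == Seq(2, 4, 6)
```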

  12. def zip_with(left: Column, right: Column, f: (Column, Column) ⇒ Column, leftElementName: String, rightElementName: String): Column

    Merges two arrays into one by iterating over both arrays at the same time and calling the function f. If one array is shorter, null elements are appended to it so that it matches the length of the longer array. Values returned by the function f become the elements of the output array.

    left: An input column of arrays
    right: An input column of arrays
    f: A function producing result elements based on two elements from the input arrays
    leftElementName: The name of the lambda variable representing an element of the first array
    rightElementName: The name of the lambda variable representing an element of the second array
    returns: A column of merged arrays

  13. def zip_with(left: Column, right: Column, f: (Column, Column) ⇒ Column): Column

    Merges two arrays into one by iterating over both arrays at the same time and calling the function f. If one array is shorter, null elements are appended to it so that it matches the length of the longer array. Values returned by the function f become the elements of the output array. The lambda variables representing the inputs to f will appear as left and right in Spark execution plans.

    left: An input column of arrays
    right: An input column of arrays
    f: A function producing result elements based on two elements from the input arrays
    returns: A column of merged arrays
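
The null-padding behavior can be modeled in plain Scala with Option standing in for SQL null (a sketch; `zipWithModel` is an illustrative name, and the result length equals the longer input):

```scala
// Plain-Scala model of zip_with: iterate both arrays in lockstep; past the
// end of the shorter array, f receives None (standing in for SQL null).
def zipWithModel[A, B, C](left: Seq[A], right: Seq[B])(f: (Option[A], Option[B]) => C): Seq[C] = {
  val n = math.max(left.length, right.length)
  (0 until n).map(i => f(left.lift(i), right.lift(i)))
}

// Element-wise sum, treating a missing element as 0.
val sums = zipWithModel(Seq(1, 2, 3), Seq(10, 20))((l, r) => l.getOrElse(0) + r.getOrElse(0))
// sums == Seq(11, 22, 3)
```

Note that unlike Scala's built-in zip, which truncates to the shorter length, zip_with extends to the longer length, which is why the model must handle the missing side explicitly.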
