Compression

org.apache.pekko.stream.javadsl.Compression
object Compression

Attributes

Source
Compression.scala
Supertypes
class Object
trait Matchable
class Any
Members list

Value members

Concrete methods

def deflate(): Flow[ByteString, ByteString, NotUsed]

Creates a flow that deflate-compresses a stream of ByteStrings. Note that the compressor will flush after every element in the stream, so it is guaranteed that every pekko.util.ByteString coming out of the flow can be fully decompressed without waiting for additional data. This may come at a compression performance cost for very small chunks.

Attributes

Source
Compression.scala
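The per-element flush guarantee can be illustrated outside of Pekko with the JDK's java.util.zip primitives that deflate compression is based on: a SYNC_FLUSH-ed deflate chunk is fully decompressible on its own, without waiting for later input. A minimal standalone sketch (class name is illustrative, not part of the API):

```java
import java.util.Arrays;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class DeflateFlushDemo {
    public static void main(String[] args) throws Exception {
        Deflater deflater = new Deflater();
        Inflater inflater = new Inflater();
        byte[] out = new byte[256];
        for (String element : new String[] {"first", "second"}) {
            deflater.setInput(element.getBytes("UTF-8"));
            // SYNC_FLUSH forces all pending output out, so the emitted
            // chunk can be decoded without any further input
            int n = deflater.deflate(out, 0, out.length, Deflater.SYNC_FLUSH);
            byte[] chunk = Arrays.copyOf(out, n);

            // The decompressor fully recovers the element from this chunk alone
            inflater.setInput(chunk);
            byte[] plain = new byte[256];
            int m = inflater.inflate(plain);
            System.out.println(new String(plain, 0, m, "UTF-8"));
        }
    }
}
```

Without the flush, the compressor could buffer part of an element internally, and a downstream decompressor would stall until more data arrived.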
def deflate(level: Int, nowrap: Boolean): Flow[ByteString, ByteString, NotUsed]

Same as deflate with configurable level and nowrap.

Value parameters

level

Compression level (0-9)

nowrap

If true, use GZIP-compatible compression.

Attributes

Source
Compression.scala
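The nowrap flag corresponds to the raw-deflate mode of the JDK's Deflater: the zlib header and checksum are omitted, which is the framing GZIP uses internally. A standalone sketch of what the flag changes (not using Pekko itself):

```java
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class NowrapDemo {
    public static void main(String[] args) throws Exception {
        byte[] data = "hello".getBytes("UTF-8");

        // nowrap = true: raw deflate without the zlib wrapper
        Deflater deflater = new Deflater(Deflater.BEST_COMPRESSION, true);
        deflater.setInput(data);
        deflater.finish();
        byte[] buf = new byte[256];
        int n = deflater.deflate(buf);

        // A matching nowrap Inflater is required to read it back.
        // In nowrap mode the Inflater needs one extra dummy byte of input
        // (documented JDK requirement); buf is zero-initialized, so n + 1 works.
        Inflater inflater = new Inflater(true);
        inflater.setInput(buf, 0, n + 1);
        byte[] plain = new byte[256];
        int m = inflater.inflate(plain);
        System.out.println(new String(plain, 0, m, "UTF-8"));
    }
}
```

Compressing with nowrap = true and decompressing with a default (wrapped) inflater, or vice versa, fails; the setting must match on both sides, which is why inflate also takes a nowrap parameter.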
def deflate(level: Int, nowrap: Boolean, autoFlush: Boolean): Flow[ByteString, ByteString, NotUsed]

Same as deflate with configurable level, nowrap and autoFlush.

Value parameters

autoFlush

If true, automatically flush after every element in the stream.

level

Compression level (0-9)

nowrap

If true, use GZIP-compatible compression.

Attributes

Since

1.3.0

Source
Compression.scala

def gzip(): Flow[ByteString, ByteString, NotUsed]

Creates a flow that gzip-compresses a stream of ByteStrings. Note that the compressor will flush after every element in the stream, so it is guaranteed that every pekko.util.ByteString coming out of the flow can be fully decompressed without waiting for additional data. This may come at a compression performance cost for very small chunks.

Attributes

Source
Compression.scala
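The wire format produced here is ordinary gzip, so it interoperates with any gzip tooling. A standalone round trip using the JDK's stream classes (illustrative, not the Pekko API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTripDemo {
    public static void main(String[] args) throws Exception {
        byte[] data = "compress me".getBytes("UTF-8");

        // Compress: GZIPOutputStream writes the standard gzip framing
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        }
        byte[] compressed = bos.toByteArray();

        // Decompress and recover the original bytes
        GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(compressed));
        byte[] plain = in.readAllBytes();
        System.out.println(new String(plain, "UTF-8"));
    }
}
```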

def gzip(level: Int): Flow[ByteString, ByteString, NotUsed]

Same as gzip with a custom level.

Value parameters

level

Compression level (0-9)

Attributes

Source
Compression.scala
def gzip(level: Int, autoFlush: Boolean): Flow[ByteString, ByteString, NotUsed]

Same as gzip with a custom level and configurable flush mode.

Value parameters

autoFlush

If true, automatically flush after every element in the stream.

level

Compression level (0-9)

Attributes

Since

1.3.0

Source
Compression.scala
def gzipDecompress(maxBytesPerChunk: Int): Flow[ByteString, ByteString, NotUsed]

Creates a Flow that decompresses a gzip-compressed stream of data.

Value parameters

maxBytesPerChunk

Maximum length of the output pekko.util.ByteString chunk.

Attributes

Since

1.3.0

Source
Compression.scala
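The maxBytesPerChunk parameter bounds the size of each emitted ByteString, so one small compressed input cannot expand into a single huge output element. The effect can be sketched with the JDK's GZIPInputStream by reading through a fixed-size buffer (the bound of 4 is an arbitrary value chosen for the demo):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class ChunkedGunzipDemo {
    public static void main(String[] args) throws Exception {
        int maxBytesPerChunk = 4; // mirrors the flow's parameter
        byte[] data = "0123456789".getBytes("UTF-8");

        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        }

        GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(bos.toByteArray()));
        List<Integer> chunkSizes = new ArrayList<>();
        byte[] buf = new byte[maxBytesPerChunk];
        int n;
        while ((n = in.read(buf)) != -1) {
            chunkSizes.add(n); // no emitted chunk exceeds maxBytesPerChunk
        }

        boolean bounded = chunkSizes.stream().allMatch(s -> s <= maxBytesPerChunk);
        int total = chunkSizes.stream().mapToInt(Integer::intValue).sum();
        System.out.println(bounded + " " + total);
    }
}
```

All 10 decompressed bytes arrive, but never more than maxBytesPerChunk of them at a time.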
def inflate(maxBytesPerChunk: Int): Flow[ByteString, ByteString, NotUsed]

Creates a Flow that decompresses a deflate-compressed stream of data.

Value parameters

maxBytesPerChunk

Maximum length of the output pekko.util.ByteString chunk.

Attributes

Source
Compression.scala
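inflate is the counterpart of deflate above; together they form a lossless round trip over the zlib-wrapped deflate format. A standalone sketch with the JDK primitives the format is defined by (class name is illustrative):

```java
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class InflateRoundTripDemo {
    public static void main(String[] args) throws Exception {
        byte[] data = "round trip".getBytes("UTF-8");

        // Compress with the default (zlib-wrapped) deflate format
        Deflater deflater = new Deflater();
        deflater.setInput(data);
        deflater.finish();
        byte[] buf = new byte[256];
        int n = deflater.deflate(buf);

        // Decompress with a matching default Inflater
        Inflater inflater = new Inflater();
        inflater.setInput(buf, 0, n);
        byte[] plain = new byte[256];
        int m = inflater.inflate(plain);
        System.out.println(new String(plain, 0, m, "UTF-8"));
    }
}
```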
def inflate(maxBytesPerChunk: Int, nowrap: Boolean): Flow[ByteString, ByteString, NotUsed]

Same as inflate with configurable maximum output length and nowrap.

Value parameters

maxBytesPerChunk

Maximum length of the output pekko.util.ByteString chunk.

nowrap

If true, use GZIP-compatible decompression.

Attributes

Source
Compression.scala

Deprecated methods

def gunzip(maxBytesPerChunk: Int): Flow[ByteString, ByteString, NotUsed]

Creates a Flow that decompresses a gzip-compressed stream of data.

Value parameters

maxBytesPerChunk

Maximum length of the output pekko.util.ByteString chunk.

Attributes

Deprecated
[Since version Pekko 1.3.0] Use gzipDecompress instead
Source
Compression.scala