A B C D E F G H I L M N P R S T U V W 

A

addColumn(ColumnChunkMetaData) - Method in class parquet.hadoop.metadata.BlockMetaData
 

B

BadConfigurationException - Exception in parquet.hadoop
Thrown when the input/output formats are misconfigured
BadConfigurationException() - Constructor for exception parquet.hadoop.BadConfigurationException
 
BadConfigurationException(String, Throwable) - Constructor for exception parquet.hadoop.BadConfigurationException
 
BadConfigurationException(String) - Constructor for exception parquet.hadoop.BadConfigurationException
 
BadConfigurationException(Throwable) - Constructor for exception parquet.hadoop.BadConfigurationException
 
BenchmarkCounter - Class in parquet.hadoop.util.counters
Encapsulates counter operations; compatible with Hadoop 1 and 2 and with both the mapred and mapreduce APIs
BenchmarkCounter() - Constructor for class parquet.hadoop.util.counters.BenchmarkCounter
 
BenchmarkCounter.NullCounter - Class in parquet.hadoop.util.counters
 
BenchmarkCounter.NullCounter() - Constructor for class parquet.hadoop.util.counters.BenchmarkCounter.NullCounter
 
BLOCK_SIZE - Static variable in class parquet.hadoop.ParquetOutputFormat
 
BlockMetaData - Class in parquet.hadoop.metadata
Block metadata stored in the footer and passed in an InputSplit
BlockMetaData() - Constructor for class parquet.hadoop.metadata.BlockMetaData
 

C

close() - Method in class parquet.hadoop.ParquetFileReader
 
close() - Method in class parquet.hadoop.ParquetReader
 
close() - Method in class parquet.hadoop.ParquetRecordReader
close(TaskAttemptContext) - Method in class parquet.hadoop.ParquetRecordWriter
close() - Method in class parquet.hadoop.ParquetWriter
 
CodecConfig - Class in parquet.hadoop.codec
Template class and factory for accessing codec-related configuration in the different APIs (mapreduce or mapred): use CodecConfig.from(org.apache.hadoop.mapred.JobConf) for the mapred API, and CodecConfig.from(org.apache.hadoop.mapreduce.TaskAttemptContext) for the mapreduce API
CodecConfig() - Constructor for class parquet.hadoop.codec.CodecConfig
 
ColumnChunkMetaData - Class in parquet.hadoop.metadata
Column meta data for a block stored in the file footer and passed in the InputSplit
ColumnChunkMetaData(ColumnChunkProperties) - Constructor for class parquet.hadoop.metadata.ColumnChunkMetaData
 
ColumnChunkProperties - Class in parquet.hadoop.metadata
 
ColumnPath - Class in parquet.hadoop.metadata
 
commitJob(JobContext) - Method in class parquet.hadoop.ParquetOutputCommitter
 
compress(byte[], int, int) - Method in class parquet.hadoop.codec.SnappyCompressor
Fills specified buffer with compressed data.
COMPRESSION - Static variable in class parquet.hadoop.ParquetOutputFormat
 
CompressionCodecName - Enum in parquet.hadoop.metadata
 
CompressionCodecNotSupportedException - Exception in parquet.hadoop.codec
This exception is thrown when a codec is not supported by Parquet, meaning there is no matching codec defined in CompressionCodecName
CompressionCodecNotSupportedException(Class) - Constructor for exception parquet.hadoop.codec.CompressionCodecNotSupportedException
 
ConfigurationUtil - Class in parquet.hadoop.util
 
ConfigurationUtil() - Constructor for class parquet.hadoop.util.ConfigurationUtil
 
Container<T> - Class in parquet.hadoop.mapred
A simple container of objects that you can get and set.
Container() - Constructor for class parquet.hadoop.mapred.Container
 
ContextUtil - Class in parquet.hadoop.util
Utility methods to allow applications to deal with inconsistencies between MapReduce Context Objects API between hadoop-0.20 and later versions.
ContextUtil() - Constructor for class parquet.hadoop.util.ContextUtil
 
CounterLoader - Interface in parquet.hadoop.util.counters
Factory interface for counter loaders: loads a counter according to groupName and counterName; if the configuration flag named counterFlag is false, the counter is not loaded
createCompressor() - Method in class parquet.hadoop.codec.SnappyCodec
 
createDecompressor() - Method in class parquet.hadoop.codec.SnappyCodec
 
createInputStream(InputStream) - Method in class parquet.hadoop.codec.SnappyCodec
 
createInputStream(InputStream, Decompressor) - Method in class parquet.hadoop.codec.SnappyCodec
 
createOutputStream(OutputStream) - Method in class parquet.hadoop.codec.SnappyCodec
 
createOutputStream(OutputStream, Compressor) - Method in class parquet.hadoop.codec.SnappyCodec
 
createRecordReader(InputSplit, TaskAttemptContext) - Method in class parquet.hadoop.ParquetInputFormat
CURRENT_VERSION - Static variable in class parquet.hadoop.ParquetFileWriter
 

D

decompress(byte[], int, int) - Method in class parquet.hadoop.codec.SnappyDecompressor
Fills specified buffer with uncompressed data.
DEFAULT_BLOCK_SIZE - Static variable in class parquet.hadoop.ParquetWriter
 
DEFAULT_COMPRESSION_CODEC_NAME - Static variable in class parquet.hadoop.ParquetWriter
 
DEFAULT_IS_DICTIONARY_ENABLED - Static variable in class parquet.hadoop.ParquetWriter
 
DEFAULT_IS_VALIDATING_ENABLED - Static variable in class parquet.hadoop.ParquetWriter
 
DEFAULT_PAGE_SIZE - Static variable in class parquet.hadoop.ParquetWriter
 
DEFAULT_WRITER_VERSION - Static variable in class parquet.hadoop.ParquetWriter
 
DeprecatedParquetInputFormat<V> - Class in parquet.hadoop.mapred
 
DeprecatedParquetInputFormat() - Constructor for class parquet.hadoop.mapred.DeprecatedParquetInputFormat
 
DeprecatedParquetOutputFormat<V> - Class in parquet.hadoop.mapred
 
DeprecatedParquetOutputFormat() - Constructor for class parquet.hadoop.mapred.DeprecatedParquetOutputFormat
 
DICTIONARY_PAGE_SIZE - Static variable in class parquet.hadoop.ParquetOutputFormat
 

E

ENABLE_DICTIONARY - Static variable in class parquet.hadoop.ParquetOutputFormat
 
EncodingList - Class in parquet.hadoop.metadata
 
end() - Method in class parquet.hadoop.codec.SnappyCompressor
 
end() - Method in class parquet.hadoop.codec.SnappyDecompressor
 
end(Map<String, String>) - Method in class parquet.hadoop.ParquetFileWriter
ends a file once all blocks have been written.
endBlock() - Method in class parquet.hadoop.ParquetFileWriter
ends a block once all column chunks have been written
endColumn() - Method in class parquet.hadoop.ParquetFileWriter
ends a column (once all repetition levels, definition levels and data have been written)
equals(Object) - Method in class parquet.hadoop.metadata.ColumnChunkProperties
 
equals(Object) - Method in class parquet.hadoop.metadata.ColumnPath
 
equals(Object) - Method in class parquet.hadoop.metadata.EncodingList
 
ExampleInputFormat - Class in parquet.hadoop.example
Example input format to read Parquet files. This input format uses a rather inefficient data model but works independently of higher-level abstractions.
ExampleInputFormat() - Constructor for class parquet.hadoop.example.ExampleInputFormat
 
ExampleOutputFormat - Class in parquet.hadoop.example
An example output format; it must be provided the schema up front
ExampleOutputFormat() - Constructor for class parquet.hadoop.example.ExampleOutputFormat
 

F

FileMetaData - Class in parquet.hadoop.metadata
File level meta data (Schema, codec, ...)
FileMetaData(MessageType, Map<String, String>, String) - Constructor for class parquet.hadoop.metadata.FileMetaData
 
finish() - Method in class parquet.hadoop.codec.SnappyCompressor
 
finished() - Method in class parquet.hadoop.codec.SnappyCompressor
 
finished() - Method in class parquet.hadoop.codec.SnappyDecompressor
 
Footer - Class in parquet.hadoop
Represent the footer for a given file
Footer(Path, ParquetMetadata) - Constructor for class parquet.hadoop.Footer
 
from(JobConf) - Static method in class parquet.hadoop.codec.CodecConfig
uses the mapred API to read the codec configuration
from(TaskAttemptContext) - Static method in class parquet.hadoop.codec.CodecConfig
uses the mapreduce API to read the codec configuration
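the two from(...) overloads above select between Hadoop's old and new APIs; a minimal sketch of the pattern (assumes parquet-mr and the Hadoop client jars on the classpath; isHadoopCompressionSet() is taken from the CodecConfig entries elsewhere in this index):

```java
import org.apache.hadoop.mapred.JobConf;

import parquet.hadoop.codec.CodecConfig;

class CodecConfigSketch {
    static void mapredExample(JobConf conf) {
        // mapred API: obtain codec-related configuration from a JobConf
        CodecConfig config = CodecConfig.from(conf);
        // check whether a Hadoop-level output compression codec is configured
        boolean compressionSet = config.isHadoopCompressionSet();
    }
    // for the new API, the analogous call is
    // CodecConfig.from(org.apache.hadoop.mapreduce.TaskAttemptContext)
}
```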
fromCompressionCodec(Class<?>) - Static method in enum parquet.hadoop.metadata.CompressionCodecName
 
fromConf(String) - Static method in enum parquet.hadoop.metadata.CompressionCodecName
 
fromJSON(String) - Static method in class parquet.hadoop.metadata.ParquetMetadata
 
fromParquet(CompressionCodec) - Static method in enum parquet.hadoop.metadata.CompressionCodecName
 
fromParquetMetadata(FileMetaData) - Method in class parquet.format.converter.ParquetMetadataConverter
 

G

get() - Method in class parquet.hadoop.mapred.Container
 
get(ColumnPath, PrimitiveType.PrimitiveTypeName, CompressionCodecName, Set<Encoding>, long, long, long, long, long) - Static method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
get(ColumnPath, PrimitiveType.PrimitiveTypeName, CompressionCodecName, Set<Encoding>) - Static method in class parquet.hadoop.metadata.ColumnChunkProperties
 
get(String...) - Static method in class parquet.hadoop.metadata.ColumnPath
 
getBlocks() - Method in class parquet.hadoop.metadata.ParquetMetadata
 
getBlocks() - Method in class parquet.hadoop.ParquetInputSplit
 
getBlockSize(JobContext) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getBlockSize(Configuration) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getBytesRead() - Method in class parquet.hadoop.codec.SnappyCompressor
 
getBytesRead() - Static method in class parquet.hadoop.util.counters.BenchmarkCounter
 
getBytesWritten() - Method in class parquet.hadoop.codec.SnappyCompressor
 
getClassFromConfig(Configuration, String, Class<?>) - Static method in class parquet.hadoop.util.ConfigurationUtil
 
getCodec() - Method in class parquet.hadoop.codec.CodecConfig
 
getCodec() - Method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
getCodec() - Method in class parquet.hadoop.metadata.ColumnChunkProperties
 
getCodecClass() - Method in exception parquet.hadoop.codec.CompressionCodecNotSupportedException
 
getColumns() - Method in class parquet.hadoop.metadata.BlockMetaData
 
getCompression(JobContext) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getCompression(Configuration) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getCompressorType() - Method in class parquet.hadoop.codec.SnappyCodec
 
getConf() - Method in class parquet.hadoop.codec.SnappyCodec
 
getConfiguration() - Method in class parquet.hadoop.api.InitContext
 
getConfiguration() - Method in class parquet.hadoop.codec.CodecConfig
 
getConfiguration(JobContext) - Static method in class parquet.hadoop.util.ContextUtil
Invokes the getConfiguration() method on JobContext.
getCount() - Method in class parquet.hadoop.util.counters.BenchmarkCounter.NullCounter
 
getCount() - Method in interface parquet.hadoop.util.counters.ICounter
 
getCount() - Method in class parquet.hadoop.util.counters.mapred.MapRedCounterAdapter
 
getCount() - Method in class parquet.hadoop.util.counters.mapreduce.MapReduceCounterAdapter
 
getCounter(TaskInputOutputContext, String, String) - Static method in class parquet.hadoop.util.ContextUtil
 
getCounterByNameAndFlag(String, String, String) - Method in interface parquet.hadoop.util.counters.CounterLoader
 
getCounterByNameAndFlag(String, String, String) - Method in class parquet.hadoop.util.counters.mapred.MapRedCounterLoader
 
getCounterByNameAndFlag(String, String, String) - Method in class parquet.hadoop.util.counters.mapreduce.MapReduceCounterLoader
 
getCreatedBy() - Method in class parquet.hadoop.metadata.FileMetaData
 
getCreatedBy() - Method in class parquet.hadoop.metadata.GlobalMetaData
 
getCurrentKey() - Method in class parquet.hadoop.ParquetRecordReader
always returns null
getCurrentValue() - Method in class parquet.hadoop.ParquetRecordReader
getDecompressorType() - Method in class parquet.hadoop.codec.SnappyCodec
 
getDefaultExtension() - Method in class parquet.hadoop.codec.SnappyCodec
 
getDictionaryPageOffset() - Method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
getDictionaryPageSize(JobContext) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getDictionaryPageSize(Configuration) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getEnableDictionary(JobContext) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getEnableDictionary(Configuration) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getEncoding(Encoding) - Method in class parquet.format.converter.ParquetMetadataConverter
 
getEncoding(Encoding) - Method in class parquet.format.converter.ParquetMetadataConverter
 
getEncodingList(List<Encoding>) - Static method in class parquet.hadoop.metadata.EncodingList
 
getEncodings() - Method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
getEncodings() - Method in class parquet.hadoop.metadata.ColumnChunkProperties
 
getExtension() - Method in enum parquet.hadoop.metadata.CompressionCodecName
 
getExtraMetaData() - Method in class parquet.hadoop.api.WriteSupport.WriteContext
 
getExtraMetadata() - Method in class parquet.hadoop.ParquetInputSplit
 
getFile() - Method in class parquet.hadoop.Footer
 
getFileMetaData() - Method in class parquet.hadoop.metadata.ParquetMetadata
 
getFileSchema() - Method in class parquet.hadoop.api.InitContext
this is the union of all the schemas when reading multiple files.
getFileSchema() - Method in class parquet.hadoop.ParquetInputSplit
 
getFirstDataPageOffset() - Method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
getFooters(JobConf) - Method in class parquet.hadoop.mapred.DeprecatedParquetInputFormat
 
getFooters(JobContext) - Method in class parquet.hadoop.ParquetInputFormat
 
getFooters(Configuration, List<FileStatus>) - Method in class parquet.hadoop.ParquetInputFormat
the footers for the files
getGlobalMetaData(JobContext) - Method in class parquet.hadoop.ParquetInputFormat
 
getHadoopCompressionCodecClass() - Method in enum parquet.hadoop.metadata.CompressionCodecName
 
getHadoopCompressionCodecClassName() - Method in enum parquet.hadoop.metadata.CompressionCodecName
 
getHadoopOutputCompressorClass(Class) - Method in class parquet.hadoop.codec.CodecConfig
 
getKeyValueMetadata() - Method in class parquet.hadoop.api.InitContext
each key is associated with the list of distinct values found in footers
getKeyValueMetaData() - Method in class parquet.hadoop.metadata.FileMetaData
 
getKeyValueMetaData() - Method in class parquet.hadoop.metadata.GlobalMetaData
 
getMergedKeyValueMetaData() - Method in class parquet.hadoop.api.InitContext
Deprecated.
getOutputCommitter(TaskAttemptContext) - Method in class parquet.hadoop.ParquetOutputFormat
 
getPageSize(JobContext) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getPageSize(Configuration) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getParquetCompressionCodec(Configuration) - Static method in class parquet.hadoop.codec.CodecConfig
 
getParquetCompressionCodec() - Method in enum parquet.hadoop.metadata.CompressionCodecName
 
getParquetMetadata() - Method in class parquet.hadoop.Footer
 
getPath() - Method in class parquet.hadoop.metadata.BlockMetaData
 
getPath() - Method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
getPath() - Method in class parquet.hadoop.metadata.ColumnChunkProperties
 
getPos() - Method in class parquet.hadoop.ParquetFileWriter
 
getProgress() - Method in class parquet.hadoop.ParquetRecordReader
getReadSupport(Configuration) - Method in class parquet.hadoop.ParquetInputFormat
 
getReadSupportClass(Configuration) - Static method in class parquet.hadoop.ParquetInputFormat
 
getReadSupportMetadata() - Method in class parquet.hadoop.api.ReadSupport.ReadContext
 
getReadSupportMetadata() - Method in class parquet.hadoop.ParquetInputSplit
 
getRecordReader(InputSplit, JobConf, Reporter) - Method in class parquet.hadoop.mapred.DeprecatedParquetInputFormat
 
getRecordWriter(FileSystem, JobConf, String, Progressable) - Method in class parquet.hadoop.mapred.DeprecatedParquetOutputFormat
 
getRecordWriter(TaskAttemptContext) - Method in class parquet.hadoop.ParquetOutputFormat
getRecordWriter(TaskAttemptContext, Path) - Method in class parquet.hadoop.ParquetOutputFormat
 
getRecordWriter(Configuration, Path, CompressionCodecName) - Method in class parquet.hadoop.ParquetOutputFormat
 
getRemaining() - Method in class parquet.hadoop.codec.SnappyDecompressor
 
getRequestedSchema() - Method in class parquet.hadoop.api.ReadSupport.ReadContext
 
getRequestedSchema() - Method in class parquet.hadoop.ParquetInputSplit
 
getRowCount() - Method in class parquet.hadoop.metadata.BlockMetaData
 
getSchema() - Method in class parquet.hadoop.api.WriteSupport.WriteContext
 
getSchema(Job) - Static method in class parquet.hadoop.example.ExampleOutputFormat
retrieve the schema from the conf
getSchema(Configuration) - Static method in class parquet.hadoop.example.GroupWriteSupport
 
getSchema() - Method in class parquet.hadoop.metadata.FileMetaData
 
getSchema() - Method in class parquet.hadoop.metadata.GlobalMetaData
 
getSchemaForRead(MessageType, String) - Static method in class parquet.hadoop.api.ReadSupport
attempts to validate and construct a MessageType from a read projection schema
getSchemaForRead(MessageType, MessageType) - Static method in class parquet.hadoop.api.ReadSupport
 
getSplits(JobConf, int) - Method in class parquet.hadoop.mapred.DeprecatedParquetInputFormat
 
getSplits(JobContext) - Method in class parquet.hadoop.ParquetInputFormat
getSplits(Configuration, List<Footer>) - Method in class parquet.hadoop.ParquetInputFormat
 
getTime() - Static method in class parquet.hadoop.util.counters.BenchmarkCounter
 
getTotalBytes() - Static method in class parquet.hadoop.util.counters.BenchmarkCounter
 
getTotalByteSize() - Method in class parquet.hadoop.metadata.BlockMetaData
 
getTotalSize() - Method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
getTotalUncompressedSize() - Method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
getType() - Method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
getType() - Method in class parquet.hadoop.metadata.ColumnChunkProperties
 
getUnboundRecordFilter(Configuration) - Static method in class parquet.hadoop.ParquetInputFormat
 
getValidation(JobContext) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getValidation(Configuration) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getValueCount() - Method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
getWriterVersion(Configuration) - Static method in class parquet.hadoop.ParquetOutputFormat
 
getWriteSupport(Configuration) - Method in class parquet.hadoop.ParquetOutputFormat
 
getWriteSupportClass(Configuration) - Static method in class parquet.hadoop.ParquetOutputFormat
 
GlobalMetaData - Class in parquet.hadoop.metadata
Merged metadata when reading from multiple files.
GlobalMetaData(MessageType, Map<String, Set<String>>, Set<String>) - Constructor for class parquet.hadoop.metadata.GlobalMetaData
 
GroupReadSupport - Class in parquet.hadoop.example
 
GroupReadSupport() - Constructor for class parquet.hadoop.example.GroupReadSupport
 
GroupWriteSupport - Class in parquet.hadoop.example
 
GroupWriteSupport() - Constructor for class parquet.hadoop.example.GroupWriteSupport
 

H

hashCode() - Method in class parquet.hadoop.metadata.ColumnChunkProperties
 
hashCode() - Method in class parquet.hadoop.metadata.ColumnPath
 
hashCode() - Method in class parquet.hadoop.metadata.EncodingList
 

I

ICounter - Interface in parquet.hadoop.util.counters
Interface for counters in mapred/mapreduce package of hadoop
increment(long) - Method in class parquet.hadoop.util.counters.BenchmarkCounter.NullCounter
 
increment(long) - Method in interface parquet.hadoop.util.counters.ICounter
 
increment(long) - Method in class parquet.hadoop.util.counters.mapred.MapRedCounterAdapter
 
increment(long) - Method in class parquet.hadoop.util.counters.mapreduce.MapReduceCounterAdapter
 
incrementBytesRead(long) - Static method in class parquet.hadoop.util.counters.BenchmarkCounter
 
incrementCounter(Counter, long) - Static method in class parquet.hadoop.util.ContextUtil
 
incrementTime(long) - Static method in class parquet.hadoop.util.counters.BenchmarkCounter
 
incrementTotalBytes(long) - Static method in class parquet.hadoop.util.counters.BenchmarkCounter
 
init(Configuration, Map<String, String>, MessageType) - Method in class parquet.hadoop.api.ReadSupport
Deprecated.
init(InitContext) - Method in class parquet.hadoop.api.ReadSupport
called in InputFormat.getSplits(org.apache.hadoop.mapreduce.JobContext) in the front end
init(Configuration) - Method in class parquet.hadoop.api.WriteSupport
called first in the task
init(Configuration, Map<String, String>, MessageType) - Method in class parquet.hadoop.example.GroupReadSupport
 
init(Configuration) - Method in class parquet.hadoop.example.GroupWriteSupport
 
InitContext - Class in parquet.hadoop.api
Context passed to ReadSupport when initializing for read
InitContext(Configuration, Map<String, Set<String>>, MessageType) - Constructor for class parquet.hadoop.api.InitContext
 
initCounterFromContext(TaskInputOutputContext<?, ?, ?, ?>) - Static method in class parquet.hadoop.util.counters.BenchmarkCounter
Initializes counters via hadoop's mapreduce API; supports both 1.x and 2.x
initCounterFromReporter(Reporter, Configuration) - Static method in class parquet.hadoop.util.counters.BenchmarkCounter
Initializes counters via hadoop's mapred API, which is used by Cascading and Hive.
initialize(InputSplit, TaskAttemptContext) - Method in class parquet.hadoop.ParquetRecordReader
initialize(InputSplit, Configuration, Reporter) - Method in class parquet.hadoop.ParquetRecordReader
 
isCompressionSet(JobContext) - Static method in class parquet.hadoop.ParquetOutputFormat
 
isCompressionSet(Configuration) - Static method in class parquet.hadoop.ParquetOutputFormat
 
isHadoopCompressionSet() - Method in class parquet.hadoop.codec.CodecConfig
 
isParquetCompressionSet(Configuration) - Static method in class parquet.hadoop.codec.CodecConfig
 
iterator() - Method in class parquet.hadoop.metadata.ColumnPath
 
iterator() - Method in class parquet.hadoop.metadata.EncodingList
 

L

listStatus(JobContext) - Method in class parquet.hadoop.ParquetInputFormat
 

M

MAGIC - Static variable in class parquet.hadoop.ParquetFileWriter
 
main(String[]) - Static method in class parquet.hadoop.PrintFooter
 
MapRedCounterAdapter - Class in parquet.hadoop.util.counters.mapred
Adapt a mapred counter to ICounter
MapRedCounterAdapter(Counters.Counter) - Constructor for class parquet.hadoop.util.counters.mapred.MapRedCounterAdapter
 
MapRedCounterLoader - Class in parquet.hadoop.util.counters.mapred
Concrete factory for counters in the mapred API: gets a counter via the mapred API when the corresponding flag is set, otherwise returns a NullCounter
MapRedCounterLoader(Reporter, Configuration) - Constructor for class parquet.hadoop.util.counters.mapred.MapRedCounterLoader
 
MapReduceCounterAdapter - Class in parquet.hadoop.util.counters.mapreduce
Adapt a mapreduce counter to ICounter
MapReduceCounterAdapter(Counter) - Constructor for class parquet.hadoop.util.counters.mapreduce.MapReduceCounterAdapter
 
MapReduceCounterLoader - Class in parquet.hadoop.util.counters.mapreduce
Concrete factory for counters in the mapreduce API: gets a counter via the mapreduce API when the corresponding flag is set, otherwise returns a NullCounter
MapReduceCounterLoader(TaskInputOutputContext<?, ?, ?, ?>) - Constructor for class parquet.hadoop.util.counters.mapreduce.MapReduceCounterLoader
 
merge() - Method in class parquet.hadoop.metadata.GlobalMetaData
Will merge the metadata as if it was coming from a single file.

N

needsDictionary() - Method in class parquet.hadoop.codec.SnappyDecompressor
 
needsInput() - Method in class parquet.hadoop.codec.SnappyCompressor
 
needsInput() - Method in class parquet.hadoop.codec.SnappyDecompressor
 
newGenericCounter(String, String, long) - Static method in class parquet.hadoop.util.ContextUtil
 
newJobContext(Configuration, JobID) - Static method in class parquet.hadoop.util.ContextUtil
Creates a JobContext from a JobConf and jobId, using the correct constructor based on the Hadoop version.
newTaskAttemptContext(Configuration, TaskAttemptID) - Static method in class parquet.hadoop.util.ContextUtil
Creates a TaskAttemptContext from a Configuration and TaskAttemptID, using the correct constructor based on the Hadoop version.
nextKeyValue() - Method in class parquet.hadoop.ParquetRecordReader
NonBlockedCompressorStream - Class in parquet.hadoop.codec
CompressorStream class that should be used instead of the default hadoop CompressorStream object.
NonBlockedCompressorStream(OutputStream, Compressor, int) - Constructor for class parquet.hadoop.codec.NonBlockedCompressorStream
 
NonBlockedDecompressorStream - Class in parquet.hadoop.codec
DecompressorStream class that should be used instead of the default hadoop DecompressorStream object.
NonBlockedDecompressorStream(InputStream, Decompressor, int) - Constructor for class parquet.hadoop.codec.NonBlockedDecompressorStream
 

P

PAGE_SIZE - Static variable in class parquet.hadoop.ParquetOutputFormat
 
parquet.format.converter - package parquet.format.converter
 
parquet.hadoop - package parquet.hadoop
Provides classes to store and use Parquet files in Hadoop, e.g. in a MapReduce job
parquet.hadoop.api - package parquet.hadoop.api
APIs to integrate various type systems with Parquet
parquet.hadoop.codec - package parquet.hadoop.codec
 
parquet.hadoop.example - package parquet.hadoop.example
 
parquet.hadoop.mapred - package parquet.hadoop.mapred
 
parquet.hadoop.metadata - package parquet.hadoop.metadata
 
parquet.hadoop.util - package parquet.hadoop.util
 
parquet.hadoop.util.counters - package parquet.hadoop.util.counters
 
parquet.hadoop.util.counters.mapred - package parquet.hadoop.util.counters.mapred
 
parquet.hadoop.util.counters.mapreduce - package parquet.hadoop.util.counters.mapreduce
 
PARQUET_EXAMPLE_SCHEMA - Static variable in class parquet.hadoop.example.GroupWriteSupport
 
PARQUET_METADATA_FILE - Static variable in class parquet.hadoop.ParquetFileWriter
 
PARQUET_READ_SCHEMA - Static variable in class parquet.hadoop.api.ReadSupport
configuration key for a parquet read projection schema
ParquetFileReader - Class in parquet.hadoop
Internal implementation of the Parquet file reader as a block container
ParquetFileReader(Configuration, Path, List<BlockMetaData>, List<ColumnDescriptor>) - Constructor for class parquet.hadoop.ParquetFileReader
 
ParquetFileWriter - Class in parquet.hadoop
Internal implementation of the Parquet file writer as a block container
ParquetFileWriter(Configuration, MessageType, Path) - Constructor for class parquet.hadoop.ParquetFileWriter
 
ParquetInputFormat<T> - Class in parquet.hadoop
The input format to read a Parquet file.
ParquetInputFormat() - Constructor for class parquet.hadoop.ParquetInputFormat
Hadoop will instantiate using this constructor
ParquetInputFormat(Class<S>) - Constructor for class parquet.hadoop.ParquetInputFormat
constructor used when this InputFormat is wrapped in another one (in Pig for example)
ParquetInputSplit - Class in parquet.hadoop
An input split for the Parquet format. It contains the information needed to read one block of the file.
ParquetInputSplit() - Constructor for class parquet.hadoop.ParquetInputSplit
Writables must have a parameterless constructor
ParquetInputSplit(Path, long, long, String[], List<BlockMetaData>, String, String, Map<String, String>, Map<String, String>) - Constructor for class parquet.hadoop.ParquetInputSplit
ParquetMetadata - Class in parquet.hadoop.metadata
Meta data block stored in the footer of the file; contains file-level (codec, schema, ...) and block-level (location, columns, record count, ...) meta data
ParquetMetadata(FileMetaData, List<BlockMetaData>) - Constructor for class parquet.hadoop.metadata.ParquetMetadata
 
ParquetMetadataConverter - Class in parquet.format.converter
 
ParquetMetadataConverter() - Constructor for class parquet.format.converter.ParquetMetadataConverter
 
ParquetOutputCommitter - Class in parquet.hadoop
 
ParquetOutputCommitter(Path, TaskAttemptContext) - Constructor for class parquet.hadoop.ParquetOutputCommitter
 
ParquetOutputFormat<T> - Class in parquet.hadoop
OutputFormat to write to a Parquet file. It requires a WriteSupport to convert the actual records to the underlying format.
ParquetOutputFormat(S) - Constructor for class parquet.hadoop.ParquetOutputFormat
constructor used when this OutputFormat is wrapped in another one (in Pig for example)
ParquetOutputFormat() - Constructor for class parquet.hadoop.ParquetOutputFormat
used when directly using the output format and configuring the write support implementation using parquet.write.support.class
ParquetReader<T> - Class in parquet.hadoop
Read records from a Parquet file.
ParquetReader(Path, ReadSupport<T>) - Constructor for class parquet.hadoop.ParquetReader
 
ParquetReader(Configuration, Path, ReadSupport<T>) - Constructor for class parquet.hadoop.ParquetReader
 
ParquetReader(Path, ReadSupport<T>, UnboundRecordFilter) - Constructor for class parquet.hadoop.ParquetReader
 
ParquetReader(Configuration, Path, ReadSupport<T>, UnboundRecordFilter) - Constructor for class parquet.hadoop.ParquetReader
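the constructors above, together with the read() and close() entries elsewhere in this index, combine into the usual read loop; a sketch assuming parquet-mr and Hadoop on the classpath, with Group/GroupReadSupport from the example package (Group's package, parquet.example.data, is an assumption not shown in this index):

```java
import java.io.IOException;

import org.apache.hadoop.fs.Path;

import parquet.example.data.Group;
import parquet.hadoop.ParquetReader;
import parquet.hadoop.example.GroupReadSupport;

class ReadSketch {
    static void readAll(Path file) throws IOException {
        ParquetReader<Group> reader = new ParquetReader<Group>(file, new GroupReadSupport());
        try {
            Group record;
            // read() returns null once the file is exhausted
            while ((record = reader.read()) != null) {
                // process one record
            }
        } finally {
            reader.close();
        }
    }
}
```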
 
ParquetRecordReader<T> - Class in parquet.hadoop
Reads the records from a block of a Parquet file
ParquetRecordReader(ReadSupport<T>) - Constructor for class parquet.hadoop.ParquetRecordReader
 
ParquetRecordReader(ReadSupport<T>, UnboundRecordFilter) - Constructor for class parquet.hadoop.ParquetRecordReader
 
ParquetRecordWriter<T> - Class in parquet.hadoop
Writes records to a Parquet file
ParquetRecordWriter(ParquetFileWriter, WriteSupport<T>, MessageType, Map<String, String>, int, int, CodecFactory.BytesCompressor, int, boolean, boolean, ParquetProperties.WriterVersion) - Constructor for class parquet.hadoop.ParquetRecordWriter
 
ParquetWriter<T> - Class in parquet.hadoop
Write records to a Parquet file.
ParquetWriter(Path, WriteSupport<T>, CompressionCodecName, int, int) - Constructor for class parquet.hadoop.ParquetWriter
Create a new ParquetWriter.
ParquetWriter(Path, WriteSupport<T>, CompressionCodecName, int, int, boolean, boolean) - Constructor for class parquet.hadoop.ParquetWriter
Create a new ParquetWriter.
ParquetWriter(Path, WriteSupport<T>, CompressionCodecName, int, int, int, boolean, boolean) - Constructor for class parquet.hadoop.ParquetWriter
Create a new ParquetWriter.
ParquetWriter(Path, WriteSupport<T>, CompressionCodecName, int, int, int, boolean, boolean, ParquetProperties.WriterVersion) - Constructor for class parquet.hadoop.ParquetWriter
Create a new ParquetWriter.
ParquetWriter(Path, WriteSupport<T>, CompressionCodecName, int, int, int, boolean, boolean, ParquetProperties.WriterVersion, Configuration) - Constructor for class parquet.hadoop.ParquetWriter
Create a new ParquetWriter.
ParquetWriter(Path, WriteSupport<T>) - Constructor for class parquet.hadoop.ParquetWriter
Create a new ParquetWriter.
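the overloads above progressively expose the compression codec, block and page sizes, dictionary and validation flags, writer version, and Configuration; a write-side sketch assuming parquet-mr and Hadoop on the classpath (write(T) and Group's package, parquet.example.data, are assumptions not listed in this index chunk; note that GroupWriteSupport also expects the schema in the configuration, see the PARQUET_EXAMPLE_SCHEMA entry):

```java
import java.io.IOException;

import org.apache.hadoop.fs.Path;

import parquet.example.data.Group;
import parquet.hadoop.ParquetWriter;
import parquet.hadoop.example.GroupWriteSupport;

class WriteSketch {
    static void writeAll(Path file, Iterable<Group> records) throws IOException {
        // simplest overload; the others let you pick codec, block/page sizes, etc.
        ParquetWriter<Group> writer = new ParquetWriter<Group>(file, new GroupWriteSupport());
        for (Group record : records) {
            writer.write(record);
        }
        writer.close();
    }
}
```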
positiveLongFitsInAnInt(long) - Static method in class parquet.hadoop.metadata.ColumnChunkMetaData
checks that a positive long value fits in an int.
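the check described above is simple enough to sketch standalone; this is an illustrative re-implementation of the documented behavior, not the library's actual code:

```java
public class Main {
    // illustrative re-implementation of the check described for
    // ColumnChunkMetaData.positiveLongFitsInAnInt: a positive long fits
    // in an int exactly when it lies in [0, Integer.MAX_VALUE]
    static boolean positiveLongFitsInAnInt(long value) {
        return value >= 0 && value <= Integer.MAX_VALUE;
    }

    public static void main(String[] args) {
        System.out.println(positiveLongFitsInAnInt(123L));     // fits
        System.out.println(positiveLongFitsInAnInt(1L << 40)); // does not fit
    }
}
```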
prepareForRead(Configuration, Map<String, String>, MessageType, ReadSupport.ReadContext) - Method in class parquet.hadoop.api.ReadSupport
called in RecordReader.initialize(org.apache.hadoop.mapreduce.InputSplit, org.apache.hadoop.mapreduce.TaskAttemptContext) in the back end; the returned RecordConsumer will materialize the records and add them to the destination
prepareForRead(Configuration, Map<String, String>, MessageType, ReadSupport.ReadContext) - Method in class parquet.hadoop.example.GroupReadSupport
 
prepareForWrite(RecordConsumer) - Method in class parquet.hadoop.api.WriteSupport
This will be called once per row group
prepareForWrite(RecordConsumer) - Method in class parquet.hadoop.example.GroupWriteSupport
 
PrintFooter - Class in parquet.hadoop
Utility to print footer information
PrintFooter() - Constructor for class parquet.hadoop.PrintFooter
 

R

read(byte[], int, int) - Method in class parquet.hadoop.codec.NonBlockedDecompressorStream
 
read() - Method in class parquet.hadoop.ParquetReader
 
READ_SUPPORT_CLASS - Static variable in class parquet.hadoop.ParquetInputFormat
Key to configure the ReadSupport implementation.
readAllFootersInParallel(Configuration, List<FileStatus>) - Static method in class parquet.hadoop.ParquetFileReader
 
readAllFootersInParallel(Configuration, FileStatus) - Static method in class parquet.hadoop.ParquetFileReader
 
readAllFootersInParallelUsingSummaryFiles(Configuration, List<FileStatus>) - Static method in class parquet.hadoop.ParquetFileReader
For the files provided, checks whether a summary file exists.
readFields(DataInput) - Method in class parquet.hadoop.ParquetInputSplit
readFooter(Configuration, Path) - Static method in class parquet.hadoop.ParquetFileReader
Reads the metadata block in the footer of the file.
readFooter(Configuration, FileStatus) - Static method in class parquet.hadoop.ParquetFileReader
Reads the metadata block in the footer of the file.
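readFooter pairs naturally with the ParquetMetadata helpers listed under T for inspecting a file without touching its data pages. A sketch, with an illustrative path:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import parquet.hadoop.ParquetFileReader;
import parquet.hadoop.metadata.ParquetMetadata;

public class FooterSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Reads only the footer block, not the data pages.
        ParquetMetadata footer = ParquetFileReader.readFooter(
                conf, new Path("/tmp/example.parquet"));
        // toPrettyJSON renders the schema, row groups, and column chunks.
        System.out.println(ParquetMetadata.toPrettyJSON(footer));
    }
}
```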
readFooters(Configuration, FileStatus) - Static method in class parquet.hadoop.ParquetFileReader
 
readFooters(Configuration, Path) - Static method in class parquet.hadoop.ParquetFileReader
 
readNextRowGroup() - Method in class parquet.hadoop.ParquetFileReader
Reads all the columns requested from the row group at the current file position.
readParquetMetadata(InputStream) - Method in class parquet.format.converter.ParquetMetadataConverter
 
readSummaryFile(Configuration, FileStatus) - Static method in class parquet.hadoop.ParquetFileReader
 
ReadSupport<T> - Class in parquet.hadoop.api
Abstraction used by the ParquetInputFormat to materialize records
ReadSupport() - Constructor for class parquet.hadoop.api.ReadSupport
 
ReadSupport.ReadContext - Class in parquet.hadoop.api
Information needed to read the file.
ReadSupport.ReadContext(MessageType) - Constructor for class parquet.hadoop.api.ReadSupport.ReadContext
 
ReadSupport.ReadContext(MessageType, Map<String, String>) - Constructor for class parquet.hadoop.api.ReadSupport.ReadContext
 
realInputFormat - Variable in class parquet.hadoop.mapred.DeprecatedParquetInputFormat
 
realOutputFormat - Variable in class parquet.hadoop.mapred.DeprecatedParquetOutputFormat
 
reinit(Configuration) - Method in class parquet.hadoop.codec.SnappyCompressor
 
reset() - Method in class parquet.hadoop.codec.SnappyCompressor
 
reset() - Method in class parquet.hadoop.codec.SnappyDecompressor
 

S

set(T) - Method in class parquet.hadoop.mapred.Container
 
setBlockSize(Configuration, int) - Static method in class parquet.hadoop.mapred.DeprecatedParquetOutputFormat
 
setBlockSize(Job, int) - Static method in class parquet.hadoop.ParquetOutputFormat
 
setCompression(Configuration, CompressionCodecName) - Static method in class parquet.hadoop.mapred.DeprecatedParquetOutputFormat
 
setCompression(Job, CompressionCodecName) - Static method in class parquet.hadoop.ParquetOutputFormat
 
setConf(Configuration) - Method in class parquet.hadoop.codec.SnappyCodec
 
setDictionary(byte[], int, int) - Method in class parquet.hadoop.codec.SnappyCompressor
 
setDictionary(byte[], int, int) - Method in class parquet.hadoop.codec.SnappyDecompressor
 
setDictionaryPageSize(Job, int) - Static method in class parquet.hadoop.ParquetOutputFormat
 
setEnableDictionary(Configuration, boolean) - Static method in class parquet.hadoop.mapred.DeprecatedParquetOutputFormat
 
setEnableDictionary(Job, boolean) - Static method in class parquet.hadoop.ParquetOutputFormat
 
setInput(byte[], int, int) - Method in class parquet.hadoop.codec.SnappyCompressor
 
setInput(byte[], int, int) - Method in class parquet.hadoop.codec.SnappyDecompressor
Sets input data for decompression.
setPageSize(Configuration, int) - Static method in class parquet.hadoop.mapred.DeprecatedParquetOutputFormat
 
setPageSize(Job, int) - Static method in class parquet.hadoop.ParquetOutputFormat
 
setPath(String) - Method in class parquet.hadoop.metadata.BlockMetaData
 
setReadSupportClass(Job, Class<?>) - Static method in class parquet.hadoop.ParquetInputFormat
 
setReadSupportClass(JobConf, Class<?>) - Static method in class parquet.hadoop.ParquetInputFormat
 
setRowCount(long) - Method in class parquet.hadoop.metadata.BlockMetaData
 
setSchema(Job, MessageType) - Static method in class parquet.hadoop.example.ExampleOutputFormat
Sets the schema being written to the job configuration.
setSchema(MessageType, Configuration) - Static method in class parquet.hadoop.example.GroupWriteSupport
 
setTotalByteSize(long) - Method in class parquet.hadoop.metadata.BlockMetaData
 
setUnboundRecordFilter(Job, Class<? extends UnboundRecordFilter>) - Static method in class parquet.hadoop.ParquetInputFormat
 
setValidation(JobContext, boolean) - Static method in class parquet.hadoop.ParquetOutputFormat
 
setValidation(Configuration, boolean) - Static method in class parquet.hadoop.ParquetOutputFormat
 
setWriteSupportClass(Configuration, Class<?>) - Static method in class parquet.hadoop.mapred.DeprecatedParquetOutputFormat
 
setWriteSupportClass(Job, Class<?>) - Static method in class parquet.hadoop.ParquetOutputFormat
 
setWriteSupportClass(JobConf, Class<?>) - Static method in class parquet.hadoop.ParquetOutputFormat
 
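The static setters above configure ParquetOutputFormat on a MapReduce job. A sketch combining them; the WriteSupport class, codec, and sizes are illustrative choices, not defaults:

```java
import org.apache.hadoop.mapreduce.Job;
import parquet.hadoop.ParquetOutputFormat;
import parquet.hadoop.example.GroupWriteSupport;
import parquet.hadoop.metadata.CompressionCodecName;

public class JobConfigSketch {
    public static void configure(Job job) {
        job.setOutputFormatClass(ParquetOutputFormat.class);
        ParquetOutputFormat.setWriteSupportClass(job, GroupWriteSupport.class);
        ParquetOutputFormat.setCompression(job, CompressionCodecName.GZIP);
        ParquetOutputFormat.setBlockSize(job, 128 * 1024 * 1024); // row group size
        ParquetOutputFormat.setPageSize(job, 1024 * 1024);
        ParquetOutputFormat.setEnableDictionary(job, true);
        ParquetOutputFormat.setValidation(job, false);
    }
}
```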
size() - Method in class parquet.hadoop.metadata.ColumnPath
 
size() - Method in class parquet.hadoop.metadata.EncodingList
 
SnappyCodec - Class in parquet.hadoop.codec
Snappy compression codec for Parquet.
SnappyCodec() - Constructor for class parquet.hadoop.codec.SnappyCodec
 
SnappyCompressor - Class in parquet.hadoop.codec
This class is a wrapper around the snappy compressor.
SnappyCompressor() - Constructor for class parquet.hadoop.codec.SnappyCompressor
 
SnappyDecompressor - Class in parquet.hadoop.codec
 
SnappyDecompressor() - Constructor for class parquet.hadoop.codec.SnappyDecompressor
 
SnappyUtil - Class in parquet.hadoop.codec
Utilities for SnappyCompressor and SnappyDecompressor.
SnappyUtil() - Constructor for class parquet.hadoop.codec.SnappyUtil
 
start() - Method in class parquet.hadoop.ParquetFileWriter
Starts the file.
startBlock(long) - Method in class parquet.hadoop.ParquetFileWriter
Starts a block.
startColumn(ColumnDescriptor, long, CompressionCodecName) - Method in class parquet.hadoop.ParquetFileWriter
Starts a column inside a block.
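start, startBlock, and startColumn frame the low-level write sequence; each has a matching end call (endColumn, endBlock, end) listed under E in this index. The assumed ordering, sketched as a commented outline rather than runnable code, since the page bytes must come from the column writers:

```java
// Assumed call ordering for parquet.hadoop.ParquetFileWriter (sketch only):
//
//   writer.start();                                   // write the magic number
//   writer.startBlock(recordCount);                   // one per row group
//     writer.startColumn(descriptor, valueCount, codec);
//       writer.writeDictionaryPage(dictionaryPage);   // optional
//       writer.writeDataPage(valueCount, uncompressedSize,
//                            bytes, rlEncoding, dlEncoding, valuesEncoding);
//     writer.endColumn();
//   writer.endBlock();
//   writer.end(extraMetaData);                        // write the footer
```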

T

toArray() - Method in class parquet.hadoop.metadata.ColumnPath
 
toJSON(ParquetMetadata) - Static method in class parquet.hadoop.metadata.ParquetMetadata
 
toList() - Method in class parquet.hadoop.metadata.EncodingList
 
toParquetMetadata(int, ParquetMetadata) - Method in class parquet.format.converter.ParquetMetadataConverter
 
toPrettyJSON(ParquetMetadata) - Static method in class parquet.hadoop.metadata.ParquetMetadata
 
toString() - Method in class parquet.hadoop.Footer
 
toString() - Method in class parquet.hadoop.metadata.BlockMetaData
 
toString() - Method in class parquet.hadoop.metadata.ColumnChunkMetaData
 
toString() - Method in class parquet.hadoop.metadata.ColumnChunkProperties
 
toString() - Method in class parquet.hadoop.metadata.ColumnPath
 
toString() - Method in class parquet.hadoop.metadata.FileMetaData
 
toString() - Method in class parquet.hadoop.metadata.GlobalMetaData
 
toString() - Method in class parquet.hadoop.metadata.ParquetMetadata
 
toString() - Method in class parquet.hadoop.ParquetInputSplit
 

U

UNBOUND_RECORD_FILTER - Static variable in class parquet.hadoop.ParquetInputFormat
Key to configure the record filter.

V

validateBuffer(byte[], int, int) - Static method in class parquet.hadoop.codec.SnappyUtil
 
VALIDATION - Static variable in class parquet.hadoop.ParquetOutputFormat
 
valueOf(String) - Static method in enum parquet.hadoop.metadata.CompressionCodecName
Returns the enum constant of this type with the specified name.
values() - Static method in enum parquet.hadoop.metadata.CompressionCodecName
Returns an array containing the constants of this enum type, in the order they are declared.
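valueOf and values are the standard generated enum methods; valueOf requires the exact constant name and throws IllegalArgumentException otherwise. For example:

```java
import parquet.hadoop.metadata.CompressionCodecName;

public class CodecSketch {
    public static void main(String[] args) {
        // Exact-name lookup; a lowercase "snappy" would throw.
        CompressionCodecName codec = CompressionCodecName.valueOf("SNAPPY");
        // Iterate the constants in declaration order.
        for (CompressionCodecName c : CompressionCodecName.values()) {
            System.out.println(c);
        }
    }
}
```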

W

write(T) - Method in class parquet.hadoop.api.WriteSupport
Called once per record.
write(byte[], int, int) - Method in class parquet.hadoop.codec.NonBlockedCompressorStream
 
write(Group) - Method in class parquet.hadoop.example.GroupWriteSupport
 
write(DataOutput) - Method in class parquet.hadoop.ParquetInputSplit
write(Void, T) - Method in class parquet.hadoop.ParquetRecordWriter
write(T) - Method in class parquet.hadoop.ParquetWriter
 
WRITE_SUPPORT_CLASS - Static variable in class parquet.hadoop.ParquetOutputFormat
 
writeDataPage(int, int, BytesInput, Encoding, Encoding, Encoding) - Method in class parquet.hadoop.ParquetFileWriter
Writes a single data page.
writeDataPageHeader(int, int, int, Encoding, Encoding, Encoding, OutputStream) - Method in class parquet.format.converter.ParquetMetadataConverter
 
writeDictionaryPage(DictionaryPage) - Method in class parquet.hadoop.ParquetFileWriter
Writes a dictionary page.
writeDictionaryPageHeader(int, int, int, Encoding, OutputStream) - Method in class parquet.format.converter.ParquetMetadataConverter
 
writeMetadataFile(Configuration, Path, List<Footer>) - Static method in class parquet.hadoop.ParquetFileWriter
Writes a _metadata summary file.
WRITER_VERSION - Static variable in class parquet.hadoop.ParquetOutputFormat
 
WriteSupport<T> - Class in parquet.hadoop.api
Abstraction to use with ParquetOutputFormat to convert incoming records
WriteSupport() - Constructor for class parquet.hadoop.api.WriteSupport
 
WriteSupport.WriteContext - Class in parquet.hadoop.api
Information to be persisted in the file.
WriteSupport.WriteContext(MessageType, Map<String, String>) - Constructor for class parquet.hadoop.api.WriteSupport.WriteContext
 

Copyright © 2014. All Rights Reserved.