Uses of Class
io.trino.plugin.hive.HiveColumnHandle
Uses of HiveColumnHandle in io.trino.plugin.hive
Methods in io.trino.plugin.hive that return HiveColumnHandle:

- static HiveColumnHandle HiveColumnHandle.bucketColumnHandle()
  The column indicating the bucket id.
- static HiveColumnHandle HiveColumnHandle.createBaseColumn(String topLevelColumnName, int topLevelColumnIndex, HiveType hiveType, Type type, HiveColumnHandle.ColumnType columnType, Optional<String> comment)
- static HiveColumnHandle HiveColumnHandle.fileModifiedTimeColumnHandle()
- static HiveColumnHandle HiveColumnHandle.fileSizeColumnHandle()
- HiveColumnHandle HiveColumnHandle.getBaseColumn()
- static HiveColumnHandle HiveColumnHandle.getDeleteRowIdColumnHandle()
- HiveColumnHandle HivePageSourceProvider.ColumnMapping.getHiveColumnHandle()
- static HiveColumnHandle HiveUpdateProcessor.getUpdateRowIdColumnHandle(List<HiveColumnHandle> nonUpdatedColumnHandles)
  Returns the UPDATE row ID column handle, which depends on the three ACID columns as well as the non-updated columns.
- static HiveColumnHandle HiveColumnHandle.partitionColumnHandle()
- static HiveColumnHandle HiveColumnHandle.pathColumnHandle()
- static HiveColumnHandle HiveColumnHandle.updateRowIdColumnHandle(List<HiveColumnHandle> columnHandles, List<ColumnHandle> updatedColumns)

Methods in io.trino.plugin.hive that return types with arguments of type HiveColumnHandle:

- List<HiveColumnHandle> HiveUpdateProcessor.getAllDataColumns()
- List<HiveColumnHandle> HiveSplit.BucketConversion.getBucketColumnHandles()
- List<HiveColumnHandle> BackgroundHiveSplitLoader.BucketSplitInfo.getBucketColumns()
- List<HiveColumnHandle> HiveSplit.BucketValidation.getBucketColumns()
- List<HiveColumnHandle> HiveBucketHandle.getColumns()
- TupleDomain<HiveColumnHandle> HivePartitionResult.getCompactEffectivePredicate()
- TupleDomain<HiveColumnHandle> HiveTableHandle.getCompactEffectivePredicate()
- List<HiveColumnHandle> HiveTableHandle.getDataColumns()
- List<HiveColumnHandle> HiveWritableTableHandle.getInputColumns()
- List<HiveColumnHandle> HiveUpdateProcessor.getNonUpdatedColumns()
- List<HiveColumnHandle> HivePartitionResult.getPartitionColumns()
- List<HiveColumnHandle> HiveTableHandle.getPartitionColumns()
- List<HiveColumnHandle> HiveUpdateProcessor.getUpdatedColumns()
- List<HiveColumnHandle> HiveUpdateProcessor.mergeWithNonUpdatedColumns(List<HiveColumnHandle> updateDependencies)
  Merges the non-updated columns with the update dependencies, in allDataColumns order, and finally adds the rowId column as the last dependency.
- static List<HiveColumnHandle> HivePageSourceProvider.ColumnMapping.toColumnHandles(List<HivePageSourceProvider.ColumnMapping> regularColumnMappings, boolean doCoercion, TypeManager typeManager)

Methods in io.trino.plugin.hive with parameters of type HiveColumnHandle:

- static HivePageSourceProvider.ColumnMapping HivePageSourceProvider.ColumnMapping.empty(HiveColumnHandle hiveColumnHandle)
- static HivePageSourceProvider.ColumnMapping HivePageSourceProvider.ColumnMapping.interim(HiveColumnHandle hiveColumnHandle, int index, Optional<HiveType> baseTypeCoercionFrom)
- static boolean HiveColumnHandle.isBucketColumnHandle(HiveColumnHandle column)
- static boolean HiveColumnHandle.isFileModifiedTimeColumnHandle(HiveColumnHandle column)
- static boolean HiveColumnHandle.isFileSizeColumnHandle(HiveColumnHandle column)
- static boolean HiveColumnHandle.isPartitionColumnHandle(HiveColumnHandle column)
- static boolean HiveColumnHandle.isPathColumnHandle(HiveColumnHandle column)
- static boolean HiveColumnHandle.isRowIdColumnHandle(HiveColumnHandle column)
- static HivePageSourceProvider.ColumnMapping HivePageSourceProvider.ColumnMapping.prefilled(HiveColumnHandle hiveColumnHandle, String prefilledValue, Optional<HiveType> baseTypeCoercionFrom)
- static HivePageSourceProvider.ColumnMapping HivePageSourceProvider.ColumnMapping.regular(HiveColumnHandle hiveColumnHandle, int index, Optional<HiveType> baseTypeCoercionFrom)
- static HivePageSourceProvider.ColumnMapping HivePageSourceProvider.ColumnMapping.synthesized(HiveColumnHandle hiveColumnHandle, int index, Optional<HiveType> baseTypeCoercionFrom)

Method parameters in io.trino.plugin.hive with type arguments of type HiveColumnHandle:

- static Optional<ConnectorPageSource> HivePageSourceProvider.createHivePageSource(Set<HivePageSourceFactory> pageSourceFactories, Set<HiveRecordCursorProvider> cursorProviders, org.apache.hadoop.conf.Configuration configuration, ConnectorSession session, org.apache.hadoop.fs.Path path, OptionalInt bucketNumber, long start, long length, long estimatedFileSize, long fileModifiedTime, Properties schema, TupleDomain<HiveColumnHandle> effectivePredicate, List<HiveColumnHandle> columns, String partitionName, List<HivePartitionKey> partitionKeys, TypeManager typeManager, TableToPartitionMapping tableToPartitionMapping, Optional<HiveSplit.BucketConversion> bucketConversion, Optional<HiveSplit.BucketValidation> bucketValidation, boolean s3SelectPushdownEnabled, Optional<AcidInfo> acidInfo, boolean originalFile, AcidTransaction transaction)
- Optional<ReaderPageSource> HivePageSourceFactory.createPageSource(org.apache.hadoop.conf.Configuration configuration, ConnectorSession session, org.apache.hadoop.fs.Path path, long start, long length, long estimatedFileSize, Properties schema, List<HiveColumnHandle> columns, TupleDomain<HiveColumnHandle> effectivePredicate, Optional<AcidInfo> acidInfo, OptionalInt bucketNumber, boolean originalFile, AcidTransaction transaction)
- Optional<HiveRecordCursorProvider.ReaderRecordCursorWithProjections> GenericHiveRecordCursorProvider.createRecordCursor(org.apache.hadoop.conf.Configuration configuration, ConnectorSession session, org.apache.hadoop.fs.Path path, long start, long length, long fileSize, Properties schema, List<HiveColumnHandle> columns, TupleDomain<HiveColumnHandle> effectivePredicate, TypeManager typeManager, boolean s3SelectPushdownEnabled)
- Optional<HiveRecordCursorProvider.ReaderRecordCursorWithProjections> HiveRecordCursorProvider.createRecordCursor(org.apache.hadoop.conf.Configuration configuration, ConnectorSession session, org.apache.hadoop.fs.Path path, long start, long length, long fileSize, Properties schema, List<HiveColumnHandle> columns, TupleDomain<HiveColumnHandle> effectivePredicate, TypeManager typeManager, boolean s3SelectPushdownEnabled)
- static HiveColumnHandle HiveUpdateProcessor.getUpdateRowIdColumnHandle(List<HiveColumnHandle> nonUpdatedColumnHandles)
  Returns the UPDATE row ID column handle, which depends on the three ACID columns as well as the non-updated columns.
- List<Integer> HiveUpdateProcessor.makeDependencyChannelNumbers(List<HiveColumnHandle> dependencyColumns)
- List<Integer> HiveUpdateProcessor.makeNonUpdatedSourceChannels(List<HiveColumnHandle> dependencyColumns)
- List<HiveColumnHandle> HiveUpdateProcessor.mergeWithNonUpdatedColumns(List<HiveColumnHandle> updateDependencies)
  Merges the non-updated columns with the update dependencies, in allDataColumns order, and finally adds the rowId column as the last dependency.
- static HivePartition HivePartitionManager.parsePartition(SchemaTableName tableName, String partitionName, List<HiveColumnHandle> partitionColumns, List<Type> partitionColumnTypes)
- static boolean HivePartitionManager.partitionMatches(List<HiveColumnHandle> partitionColumns, TupleDomain<ColumnHandle> constraintSummary, HivePartition partition)
- static Optional<ReaderColumns> HivePageSourceProvider.projectBaseColumns(List<HiveColumnHandle> columns)
  Creates a mapping between the input columns and base columns, if required.
- static Optional<ReaderColumns> HivePageSourceProvider.projectSufficientColumns(List<HiveColumnHandle> columns)
  Creates a set of sufficient columns for the input projected columns and prepares a mapping between the two.
- static HiveColumnHandle HiveColumnHandle.updateRowIdColumnHandle(List<HiveColumnHandle> columnHandles, List<ColumnHandle> updatedColumns)
- void HiveStorageFormat.validateColumns(List<HiveColumnHandle> handles)

Constructor parameters in io.trino.plugin.hive with type arguments of type HiveColumnHandle:

- BucketConversion(HiveBucketing.BucketingVersion bucketingVersion, int tableBucketCount, int partitionBucketCount, List<HiveColumnHandle> bucketColumnHandles)
- BucketValidation(HiveBucketing.BucketingVersion bucketingVersion, int bucketCount, List<HiveColumnHandle> bucketColumns)
- GenericHiveRecordCursor(org.apache.hadoop.conf.Configuration configuration, org.apache.hadoop.fs.Path path, org.apache.hadoop.mapred.RecordReader<K,V> recordReader, long totalBytes, Properties splitSchema, List<HiveColumnHandle> columns)
- HiveBucketHandle(List<HiveColumnHandle> columns, HiveBucketing.BucketingVersion bucketingVersion, int tableBucketCount, int readBucketCount, List<SortingColumn> sortedBy)
- HiveInsertTableHandle(String schemaName, String tableName, List<HiveColumnHandle> inputColumns, HivePageSinkMetadata pageSinkMetadata, LocationHandle locationHandle, Optional<HiveBucketProperty> bucketProperty, HiveStorageFormat tableStorageFormat, HiveStorageFormat partitionStorageFormat, AcidTransaction transaction)
- HiveOutputTableHandle(String schemaName, String tableName, List<HiveColumnHandle> inputColumns, HivePageSinkMetadata pageSinkMetadata, LocationHandle locationHandle, HiveStorageFormat tableStorageFormat, HiveStorageFormat partitionStorageFormat, List<String> partitionedBy, Optional<HiveBucketProperty> bucketProperty, String tableOwner, Map<String,String> additionalTableParameters, AcidTransaction transaction, boolean external)
- HivePageSink(HiveWriterFactory writerFactory, List<HiveColumnHandle> inputColumns, Optional<HiveBucketProperty> bucketProperty, PageIndexerFactory pageIndexerFactory, HdfsEnvironment hdfsEnvironment, int maxOpenWriters, com.google.common.util.concurrent.ListeningExecutorService writeVerificationExecutor, io.airlift.json.JsonCodec<PartitionUpdate> partitionUpdateCodec, ConnectorSession session)
- HivePartitionResult(List<HiveColumnHandle> partitionColumns, Iterable<HivePartition> partitions, TupleDomain<HiveColumnHandle> compactEffectivePredicate, TupleDomain<ColumnHandle> unenforcedConstraint, TupleDomain<ColumnHandle> enforcedConstraint, Optional<HiveBucketHandle> bucketHandle, Optional<HiveBucketing.HiveBucketFilter> bucketFilter)
- HiveTableHandle(String schemaName, String tableName, List<HiveColumnHandle> partitionColumns, List<HiveColumnHandle> dataColumns, TupleDomain<HiveColumnHandle> compactEffectivePredicate, TupleDomain<ColumnHandle> enforcedConstraint, Optional<HiveBucketHandle> bucketHandle, Optional<HiveBucketing.HiveBucketFilter> bucketFilter, Optional<List<List<String>>> analyzePartitionValues, Optional<Set<String>> analyzeColumnNames, AcidTransaction transaction)
- HiveTableHandle(String schemaName, String tableName, Map<String,String> tableParameters, List<HiveColumnHandle> partitionColumns, List<HiveColumnHandle> dataColumns, Optional<HiveBucketHandle> bucketHandle)
- HiveTableHandle(String schemaName, String tableName, Optional<Map<String,String>> tableParameters, List<HiveColumnHandle> partitionColumns, List<HiveColumnHandle> dataColumns, Optional<List<HivePartition>> partitions, TupleDomain<HiveColumnHandle> compactEffectivePredicate, TupleDomain<ColumnHandle> enforcedConstraint, Optional<HiveBucketHandle> bucketHandle, Optional<HiveBucketing.HiveBucketFilter> bucketFilter, Optional<List<List<String>>> analyzePartitionValues, Optional<Set<String>> analyzeColumnNames, Optional<Set<ColumnHandle>> constraintColumns, Optional<Set<ColumnHandle>> projectedColumns, AcidTransaction transaction)
- HiveUpdatablePageSource(HiveTableHandle hiveTableHandle, String partitionName, int statementId, ConnectorPageSource hivePageSource, TypeManager typeManager, OptionalInt bucketNumber, org.apache.hadoop.fs.Path bucketPath, boolean originalFile, OrcFileWriterFactory orcFileWriterFactory, org.apache.hadoop.conf.Configuration configuration, ConnectorSession session, HiveType hiveRowType, List<HiveColumnHandle> dependencyColumns, AcidOperation updateKind, long initialRowId, long maxNumberOfRowsPerSplit)
- HiveUpdateProcessor(List<HiveColumnHandle> allDataColumns, List<HiveColumnHandle> updatedColumns)
- HiveWritableTableHandle(String schemaName, String tableName, List<HiveColumnHandle> inputColumns, HivePageSinkMetadata pageSinkMetadata, LocationHandle locationHandle, Optional<HiveBucketProperty> bucketProperty, HiveStorageFormat tableStorageFormat, HiveStorageFormat partitionStorageFormat, AcidTransaction transaction)
- HiveWriterFactory(Set<HiveFileWriterFactory> fileWriterFactories, String schemaName, String tableName, boolean isCreateTable, AcidTransaction transaction, List<HiveColumnHandle> inputColumns, HiveStorageFormat tableStorageFormat, HiveStorageFormat partitionStorageFormat, Map<String,String> additionalTableParameters, OptionalInt bucketCount, List<SortingColumn> sortedBy, LocationHandle locationHandle, LocationService locationService, String queryId, HivePageSinkMetadataProvider pageSinkMetadataProvider, TypeManager typeManager, HdfsEnvironment hdfsEnvironment, PageSorter pageSorter, io.airlift.units.DataSize sortBufferSize, int maxOpenSortFiles, org.joda.time.DateTimeZone parquetTimeZone, ConnectorSession session, NodeManager nodeManager, io.airlift.event.client.EventClient eventClient, HiveSessionProperties hiveSessionProperties, HiveWriterStats hiveWriterStats)
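As a rough illustration of the factory listed above, the following sketch builds a base column handle with createBaseColumn and checks it with one of the is*ColumnHandle predicates. It assumes the Trino Hive plugin and SPI are on the classpath; the column name and ordinal are invented for the example.

```java
import java.util.Optional;

import io.trino.plugin.hive.HiveColumnHandle;
import io.trino.plugin.hive.HiveType;
import io.trino.spi.type.BigintType;

public class CreateColumnHandleExample
{
    public static void main(String[] args)
    {
        // Hypothetical regular (data) column named "order_id" at ordinal 0.
        // HiveType.HIVE_LONG pairs with the Trino BIGINT type.
        HiveColumnHandle handle = HiveColumnHandle.createBaseColumn(
                "order_id",                          // topLevelColumnName
                0,                                   // topLevelColumnIndex in the table schema
                HiveType.HIVE_LONG,                  // Hive-side type
                BigintType.BIGINT,                   // Trino-side type
                HiveColumnHandle.ColumnType.REGULAR, // REGULAR, PARTITION_KEY, or SYNTHESIZED
                Optional.empty());                   // no column comment

        // The static predicates distinguish synthetic columns such as
        // $path or $bucket from ordinary data columns like this one.
        System.out.println(HiveColumnHandle.isPathColumnHandle(handle));
    }
}
```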
Uses of HiveColumnHandle in io.trino.plugin.hive.metastore
Method parameters in io.trino.plugin.hive.metastore with type arguments of type HiveColumnHandle:

- static TupleDomain<String> MetastoreUtil.computePartitionKeyFilter(List<HiveColumnHandle> partitionKeys, TupleDomain<HiveColumnHandle> effectivePredicate)
  Creates a TupleDomain entry for each partition key specified.
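A hedged sketch of how computePartitionKeyFilter is typically wired up: the partition key handles and the compact predicate come from a HiveTableHandle, and the result maps each partition key name to its allowed domain for a metastore partition-listing call. The helper method here is hypothetical; only the getters and computePartitionKeyFilter are from this page.

```java
import java.util.List;

import io.trino.plugin.hive.HiveColumnHandle;
import io.trino.plugin.hive.HiveTableHandle;
import io.trino.plugin.hive.metastore.MetastoreUtil;
import io.trino.spi.predicate.TupleDomain;

public class PartitionFilterExample
{
    // Hypothetical helper: derive the metastore-level partition filter
    // from a table handle's partition columns and compact predicate.
    static TupleDomain<String> partitionFilter(HiveTableHandle table)
    {
        List<HiveColumnHandle> partitionKeys = table.getPartitionColumns();
        TupleDomain<HiveColumnHandle> predicate = table.getCompactEffectivePredicate();
        // Keys of the resulting TupleDomain are partition key names,
        // suitable for pushing down into a partition listing.
        return MetastoreUtil.computePartitionKeyFilter(partitionKeys, predicate);
    }
}
```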
Uses of HiveColumnHandle in io.trino.plugin.hive.orc
Method parameters in io.trino.plugin.hive.orc with type arguments of type HiveColumnHandle:

- Optional<ReaderPageSource> OrcPageSourceFactory.createPageSource(org.apache.hadoop.conf.Configuration configuration, ConnectorSession session, org.apache.hadoop.fs.Path path, long start, long length, long estimatedFileSize, Properties schema, List<HiveColumnHandle> columns, TupleDomain<HiveColumnHandle> effectivePredicate, Optional<AcidInfo> acidInfo, OptionalInt bucketNumber, boolean originalFile, AcidTransaction transaction)
- static OrcPageSource.ColumnAdaptation OrcPageSource.ColumnAdaptation.updatedRowColumns(HiveUpdateProcessor updateProcessor, List<HiveColumnHandle> dependencyColumns)
- static OrcPageSource.ColumnAdaptation OrcPageSource.ColumnAdaptation.updatedRowColumnsWithOriginalFiles(long startingRowId, int bucketId, HiveUpdateProcessor updateProcessor, List<HiveColumnHandle> dependencyColumns)
Uses of HiveColumnHandle in io.trino.plugin.hive.parquet
Fields in io.trino.plugin.hive.parquet declared as HiveColumnHandle:

- static HiveColumnHandle ParquetPageSourceFactory.PARQUET_ROW_INDEX_COLUMN
  If this object is passed as one of the columns for createPageSource, it will be populated as an additional column containing the index of each row read.

Methods in io.trino.plugin.hive.parquet with parameters of type HiveColumnHandle:

- static Optional<org.apache.parquet.schema.Type> ParquetPageSourceFactory.getColumnType(HiveColumnHandle column, org.apache.parquet.schema.MessageType messageType, boolean useParquetColumnNames)
- static Optional<org.apache.parquet.schema.Type> ParquetPageSourceFactory.getParquetType(org.apache.parquet.schema.GroupType groupType, boolean useParquetColumnNames, HiveColumnHandle column)

Method parameters in io.trino.plugin.hive.parquet with type arguments of type HiveColumnHandle:

- Optional<ReaderPageSource> ParquetPageSourceFactory.createPageSource(org.apache.hadoop.conf.Configuration configuration, ConnectorSession session, org.apache.hadoop.fs.Path path, long start, long length, long estimatedFileSize, Properties schema, List<HiveColumnHandle> columns, TupleDomain<HiveColumnHandle> effectivePredicate, Optional<AcidInfo> acidInfo, OptionalInt bucketNumber, boolean originalFile, AcidTransaction transaction)
- static ReaderPageSource ParquetPageSourceFactory.createPageSource(org.apache.hadoop.fs.Path path, long start, long length, long estimatedFileSize, List<HiveColumnHandle> columns, TupleDomain<HiveColumnHandle> effectivePredicate, boolean useColumnNames, HdfsEnvironment hdfsEnvironment, org.apache.hadoop.conf.Configuration configuration, ConnectorIdentity identity, org.joda.time.DateTimeZone timeZone, FileFormatDataSourceStats stats, ParquetReaderOptions options)
  This overload is available for other callers to use directly.
- static TupleDomain<org.apache.parquet.column.ColumnDescriptor> ParquetPageSourceFactory.getParquetTupleDomain(Map<List<String>,RichColumnDescriptor> descriptorsByPath, TupleDomain<HiveColumnHandle> effectivePredicate, org.apache.parquet.schema.MessageType fileSchema, boolean useColumnNames)
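The PARQUET_ROW_INDEX_COLUMN field above is a sentinel handle: including it in the column list asks the Parquet reader to emit each row's index as an extra column. A minimal sketch of that usage, with a hypothetical helper method:

```java
import java.util.ArrayList;
import java.util.List;

import io.trino.plugin.hive.HiveColumnHandle;
import io.trino.plugin.hive.parquet.ParquetPageSourceFactory;

public class RowIndexColumnExample
{
    // Hypothetical helper: append the synthetic row-index column to the
    // data columns before building the column list for createPageSource.
    static List<HiveColumnHandle> withRowIndex(List<HiveColumnHandle> dataColumns)
    {
        List<HiveColumnHandle> columns = new ArrayList<>(dataColumns);
        // When this sentinel handle is present, the returned page source
        // populates an additional column holding the index of each row read.
        columns.add(ParquetPageSourceFactory.PARQUET_ROW_INDEX_COLUMN);
        return columns;
    }
}
```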
Uses of HiveColumnHandle in io.trino.plugin.hive.rcfile
Method parameters in io.trino.plugin.hive.rcfile with type arguments of type HiveColumnHandle:

- Optional<ReaderPageSource> RcFilePageSourceFactory.createPageSource(org.apache.hadoop.conf.Configuration configuration, ConnectorSession session, org.apache.hadoop.fs.Path path, long start, long length, long estimatedFileSize, Properties schema, List<HiveColumnHandle> columns, TupleDomain<HiveColumnHandle> effectivePredicate, Optional<AcidInfo> acidInfo, OptionalInt bucketNumber, boolean originalFile, AcidTransaction transaction)

Constructor parameters in io.trino.plugin.hive.rcfile with type arguments of type HiveColumnHandle:

- RcFilePageSource(RcFileReader rcFileReader, List<HiveColumnHandle> columns)
Uses of HiveColumnHandle in io.trino.plugin.hive.s3select
Method parameters in io.trino.plugin.hive.s3select with type arguments of type HiveColumnHandle:

- String IonSqlQueryBuilder.buildSql(List<HiveColumnHandle> columns, TupleDomain<HiveColumnHandle> tupleDomain)
- Optional<HiveRecordCursorProvider.ReaderRecordCursorWithProjections> S3SelectRecordCursorProvider.createRecordCursor(org.apache.hadoop.conf.Configuration configuration, ConnectorSession session, org.apache.hadoop.fs.Path path, long start, long length, long fileSize, Properties schema, List<HiveColumnHandle> columns, TupleDomain<HiveColumnHandle> effectivePredicate, TypeManager typeManager, boolean s3SelectPushdownEnabled)
Uses of HiveColumnHandle in io.trino.plugin.hive.util
Methods in io.trino.plugin.hive.util that return types with arguments of type HiveColumnHandle:

- static List<HiveColumnHandle> HiveUtil.getPartitionKeyColumnHandles(Table table, TypeManager typeManager)
- static List<HiveColumnHandle> HiveUtil.getRegularColumnHandles(Table table, TypeManager typeManager, HiveTimestampPrecision timestampPrecision)
- static List<HiveColumnHandle> HiveUtil.hiveColumnHandles(Table table, TypeManager typeManager, HiveTimestampPrecision timestampPrecision)

Methods in io.trino.plugin.hive.util with parameters of type HiveColumnHandle:

- static String HiveUtil.getPrefilledColumnValue(HiveColumnHandle columnHandle, HivePartitionKey partitionKey, org.apache.hadoop.fs.Path path, OptionalInt bucketNumber, long fileSize, long fileModifiedTime, String partitionName)

Method parameters in io.trino.plugin.hive.util with type arguments of type HiveColumnHandle:

- static org.apache.hadoop.mapred.RecordReader<?,?> HiveUtil.createRecordReader(org.apache.hadoop.conf.Configuration configuration, org.apache.hadoop.fs.Path path, long start, long length, Properties schema, List<HiveColumnHandle> columns)

Constructor parameters in io.trino.plugin.hive.util with type arguments of type HiveColumnHandle:

- InternalHiveSplitFactory(org.apache.hadoop.fs.FileSystem fileSystem, String partitionName, org.apache.hadoop.mapred.InputFormat<?,?> inputFormat, Properties schema, List<HivePartitionKey> partitionKeys, TupleDomain<HiveColumnHandle> effectivePredicate, BooleanSupplier partitionMatchSupplier, TableToPartitionMapping tableToPartitionMapping, Optional<HiveSplit.BucketConversion> bucketConversion, Optional<HiveSplit.BucketValidation> bucketValidation, io.airlift.units.DataSize minimumTargetSplitSize, boolean forceLocalScheduling, boolean s3SelectPushdownEnabled, AcidTransaction transaction)
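The HiveUtil helpers are the usual way to derive handles from a metastore Table. A hedged sketch combining two of them, assuming a Table obtained from the metastore and a TypeManager from the connector context; the helper method and timestamp precision choice are illustrative:

```java
import java.util.List;

import io.trino.plugin.hive.HiveColumnHandle;
import io.trino.plugin.hive.HiveTimestampPrecision;
import io.trino.plugin.hive.metastore.Table;
import io.trino.plugin.hive.util.HiveUtil;
import io.trino.spi.type.TypeManager;

public class ColumnHandlesExample
{
    // Hypothetical helper: split a table's columns into partition keys
    // and regular data columns using the HiveUtil factories above.
    static void describe(Table table, TypeManager typeManager)
    {
        List<HiveColumnHandle> partitionKeys =
                HiveUtil.getPartitionKeyColumnHandles(table, typeManager);
        List<HiveColumnHandle> dataColumns =
                HiveUtil.getRegularColumnHandles(table, typeManager, HiveTimestampPrecision.MILLISECONDS);

        System.out.printf("%d partition keys, %d data columns%n",
                partitionKeys.size(), dataColumns.size());
    }
}
```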