All Classes
| Class | Description |
| AAttributeValue<D> |
An abstract class for attribute values implementing basic functionality to
store the value as well as getters and setters.
|
| Abandonable |
Interface for Distance measures that can make use of the Early Abandon
technique.
|
| ABatchLearner<T,V,I,D extends IDataset<I>> |
Abstract extension of IBatchLearner to be able to construct
predictions of the given type.
|
| AbstractAugmentedSpaceSampler |
|
| AbstractDyadScaler |
A scaler that can be fit to a certain dataset and then be used to standardize
datasets, i.e. transform the data to have a mean of 0 and a standard
deviation of 1 according to the data it was fit to.
|
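The fit/transform standardization that AbstractDyadScaler and DyadStandardScaler describe can be sketched as follows. This is an illustrative minimal implementation (the class and method names are not the library's API): fit computes the mean and standard deviation, transform maps each value x to (x - mean) / std.

```java
// Minimal sketch of fit-then-transform standardization (z-scoring) for a
// single feature column; names are illustrative, not the library's API.
public class StandardScalerSketch {
    private double mean;
    private double std;

    // Compute mean and (population) standard deviation of the fitting data.
    public void fit(double[] data) {
        double sum = 0;
        for (double x : data) sum += x;
        mean = sum / data.length;
        double sq = 0;
        for (double x : data) sq += (x - mean) * (x - mean);
        std = Math.sqrt(sq / data.length);
    }

    // Map each value to (x - mean) / std so the fitted data has mean 0, std 1.
    public double[] transform(double[] data) {
        double[] out = new double[data.length];
        for (int i = 0; i < data.length; i++) out[i] = (data[i] - mean) / std;
        return out;
    }

    public static void main(String[] args) {
        StandardScalerSketch s = new StandardScalerSketch();
        double[] data = {2, 4, 4, 4, 5, 5, 7, 9}; // mean 5, population std 2
        s.fit(data);
        double[] z = s.transform(data);
        System.out.println(z[0] + " " + z[7]); // -1.5 2.0
    }
}
```

The same fit-once, transform-many pattern lets a scaler fitted on training data be applied consistently to test data.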
| AbstractSplitBasedClassifierEvaluator<I,O> |
Connection between an Evaluator (e.g.
|
| AccessibleRandomTree |
Random Tree extension providing leaf node information of the constructed
tree.
|
| ActiveDyadRanker |
Abstract description of a pool-based active learning strategy for dyad
ranking.
|
| ADecomposableDoubleMeasure<I> |
A measure that is decomposable by instances and aggregated by averaging.
|
| ADecomposableMeasure<I,O> |
A measure that is computed instance-wise and then aggregated, e.g. by taking the mean.
|
| ADecomposableMultilabelMeasure |
|
| ADerivateFilter |
Abstract superclass for all derivative filters.
|
| ADyadRankedNodeQueue<N,V extends java.lang.Comparable<V>> |
A queue whose elements are nodes, sorted by a dyad ranker.
|
| ADyadRankedNodeQueueConfig<N> |
A configuration for a dyad ranked node queue.
|
| ADyadRankingInstance |
|
| AFileSamplingAlgorithm |
An abstract class for file-based sampling algorithms providing basic
functionality of an algorithm.
|
| AFilter |
|
| AggressiveAggregator |
An IntervalAggregator that makes predictions using the minimum of the
predictions as the lower bound and the maximum as the upper bound.
|
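The "aggressive" aggregation described for AggressiveAggregator can be sketched in a few lines (illustrative names, not the library's API): the aggregated interval spans from the minimum to the maximum of the point predictions.

```java
// Sketch of aggressive interval aggregation: the predicted interval is
// [min(predictions), max(predictions)]. Names are illustrative only.
public class AggressiveAggregationSketch {
    // Returns {lowerBound, upperBound} for a list of point predictions.
    static double[] aggregate(double[] predictions) {
        double lo = Double.POSITIVE_INFINITY;
        double hi = Double.NEGATIVE_INFINITY;
        for (double p : predictions) {
            lo = Math.min(lo, p);
            hi = Math.max(hi, p);
        }
        return new double[] {lo, hi};
    }

    public static void main(String[] args) {
        double[] interval = aggregate(new double[] {3.5, 1.2, 4.8, 2.0});
        System.out.println(interval[0] + " " + interval[1]); // 1.2 4.8
    }
}
```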
| AILabeledAttributeArrayDataset<I extends ILabeledAttributeArrayInstance<L>,L> |
Common interface of a dataset defining methods to access meta-data and
instances contained in the dataset.
|
| AINumericLabeledAttributeArrayDataset<I extends INumericLabeledAttributeArrayInstance<L>,L> |
|
| AllPairsTable |
|
| AMCTreeNode<C> |
|
| AMinimumDistanceSearchStrategy |
Abstract class for minimum distance search strategies.
|
| AMonteCarloCrossValidationBasedEvaluatorFactory |
An abstract factory for configuring Monte Carlo cross-validation based evaluators.
|
| AnalyticalLearningCurve |
A learning curve enriched with some analytical functions.
|
| AOnlineLearner<T,V,I,D extends IDataset<I>> |
Abstract extension of IOnlineLearner to be able to construct
predictions of the given type.
|
| APredictiveModel<T,V,I,D extends IDataset<I>> |
Abstract extension of IPredictiveModel to be able to construct
predictions of the given type.
|
| AProcessListener |
The process listener may be attached to a process in order to handle its output streams in a controlled way.
|
| ARandomlyInitializingDyadRanker |
|
| ArbitrarySplitter |
Generates a purely random split of the dataset depending on the seed and on the portions provided.
|
| ArffUtilities |
Utility class for handling Arff dataset files.
|
| ASamplingAlgorithm |
An abstract class for sampling algorithms providing basic functionality of an algorithm.
|
| ASamplingAlgorithm<I,D extends IDataset<I>> |
An abstract class for sampling algorithms providing basic functionality of an
algorithm.
|
| ASimpleInstancesImpl<I> |
|
| ASimplifiedTSClassifier<T> |
Simplified batch-learning time series classifier which can be trained and
used as a predictor.
|
| ASimplifiedTSCLearningAlgorithm<T,C extends ASimplifiedTSClassifier<T>> |
|
| ASquaredErrorLoss |
Measure computing the squared error of two doubles.
|
| ATransformFilter |
Abstract superclass for all transform filters.
|
| ATSCAlgorithm<L,V,D extends TimeSeriesDataset<L>,C extends TSClassifier<L,V,D>> |
|
| AttributeBasedStratiAmountSelectorAndAssigner<I extends ILabeledAttributeArrayInstance<?>,D extends IOrderedLabeledAttributeArrayDataset<I,?>> |
This class is responsible for computing the amount of strati in
attribute-based stratified sampling and assigning elements to the strati.
|
| AttributeDiscretizationPolicy |
|
| AugSpaceAllPairs |
|
| AutoMekaGGPFitness |
|
| AutoMEKAGGPFitnessMeasure |
Fitness function for a linear combination of 4 well-known multi-label metrics: ExactMatch, Hamming, Rank and F1MacroAverageL.
|
| AutoMEKAGGPFitnessMeasureLoss |
Measure combining exact match, hamming loss, f1macroavgL and rankloss.
|
| AWeightedTrigometricDistance |
|
| BackwardDifferenceDerivate |
Filter that calculates the backward difference derivative.
|
| BilinFunction |
Wraps the NLL optimization problem for the QNMinimizer optimizer.
|
| BiliniearFeatureTransform |
Implementation of the feature transformation method using the Kronecker
product.
|
| BlackBoxGradient |
Difference quotient based gradient estimation.
|
| BooleanAttributeType |
The boolean attribute type.
|
| BooleanAttributeValue |
Boolean attribute value as it can be part of an instance.
|
| BOSSClassifier |
|
| BOSSEnsembleClassifier |
|
| BOSSLearningAlgorithm |
|
| BOSSLearningAlgorithm.IBossAlgorithmConfig |
|
| CaseControlLikeSampling<I extends ILabeledInstance<?>,D extends IDataset<I>> |
|
| CaseControlSampling<I extends ILabeledInstance<?>,D extends IDataset<I>> |
Case control sampling.
|
| CaseControlSamplingFactory<I extends ILabeledInstance<?>,D extends IDataset<I>> |
|
| CategoricalAttributeType |
The categorical attribute type describes the domain a value of a respective categorical attribute value stems from.
|
| CategoricalAttributeValue |
Categorical attribute value as it can be part of an instance.
|
| CategoricalFeatureDomain |
Description of a categorical feature domain.
|
| CheckedJaicoreMLException |
|
| ChooseKAugSpaceSampler |
Samples interval-valued data from a dataset of precise points by sampling k precise points (with replacement)
and generating a point in the interval-valued augmented space by only considering those k points, i.e. choosing
respective minima and maxima for each attribute from the chosen precise points.
|
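The sampling idea described for ChooseKAugSpaceSampler can be sketched as follows (an illustrative implementation, not the library's API): draw k precise points with replacement, then form the interval-valued point from the attribute-wise minima and maxima of just those k points.

```java
import java.util.Random;

// Illustrative sketch of the ChooseK idea: k precise points drawn with
// replacement determine the per-attribute intervals of one sampled point.
public class ChooseKSamplerSketch {
    // Returns intervals[attr][0] = min, intervals[attr][1] = max over the k draws.
    static double[][] sampleInterval(double[][] data, int k, Random rng) {
        int numAttrs = data[0].length;
        double[][] intervals = new double[numAttrs][2];
        for (double[] iv : intervals) {
            iv[0] = Double.POSITIVE_INFINITY;
            iv[1] = Double.NEGATIVE_INFINITY;
        }
        for (int i = 0; i < k; i++) {
            double[] point = data[rng.nextInt(data.length)]; // with replacement
            for (int a = 0; a < numAttrs; a++) {
                intervals[a][0] = Math.min(intervals[a][0], point[a]);
                intervals[a][1] = Math.max(intervals[a][1], point[a]);
            }
        }
        return intervals;
    }

    public static void main(String[] args) {
        double[][] data = { {1, 10}, {2, 8}, {3, 9} };
        double[][] iv = sampleInterval(data, 5, new Random(42));
        System.out.println("[" + iv[0][0] + ", " + iv[0][1] + "] x ["
                + iv[1][0] + ", " + iv[1][1] + "]");
    }
}
```

By construction each sampled interval is bounded by the attribute-wise extremes of the dataset.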
| ClassifierCache |
|
| ClassifierEvaluatorConstructionFailedException |
|
| ClassifierMetricGetter |
Class for getting metrics by their name for single- and multilabel
classifiers.
|
| ClassifierRankingForGroup |
|
| ClassifierWeightedSampling<I extends ILabeledInstance<?>,D extends IOrderedDataset<I>> |
The idea behind this sampling method is to weight instances depending on how
a pilot estimator p classified them.
|
| ClassMapper |
Class mapper used for predictions of String objects which are internally
predicted by time series classifiers as ints.
|
| ClassStratiFileAssigner |
|
| Cluster |
|
| ClusterSampling<I extends INumericLabeledAttributeArrayInstance<? extends java.lang.Number>,D extends IDataset<I>> |
|
| ClusterStratiAssigner<I extends INumericArrayInstance,D extends IDataset<I>> |
|
| ComplexityInvariantDistance |
Implementation of the Complexity Invariant Distance (CID) measure as
published in "A Complexity-Invariant Distance Measure for Time Series" by
Gustavo E.A.P.A.
|
| ConfidenceIntervalClusteringBasedActiveDyadRanker |
A prototypical active dyad ranker based on clustering of pseudo confidence
intervals.
|
| ConfigurationException |
|
| ConfigurationLearningCurveExtrapolationEvaluator |
Predicts the accuracy of a classifier with certain configurations on a point
of its learning curve, given some anchorpoints and its configurations, using
the LCNet of pybnn.
Note: this code was copied from LearningCurveExtrapolationEvaluator and
slightly reworked.
|
| ConfigurationLearningCurveExtrapolator<I extends ILabeledAttributeArrayInstance<?>,D extends IOrderedLabeledAttributeArrayDataset<I,?>> |
This class is a subclass of LearningCurveExtrapolator which deals
with the slightly different setup that is required by the LCNet
of pybnn.
|
| ConstantClassifier |
|
| ContainsNonNumericAttributesException |
|
| CosineTransform |
Calculates the cosine transform of a time series.
|
| DataProvider |
|
| DatasetCapacityReachedException |
|
| DatasetCharacterizerInitializationFailedException |
An exception that signifies something went wrong during the initialization of
a dataset characterizer.
|
| DatasetCreationException |
|
| DatasetFileSorter |
Sorts a dataset file using merge sort.
|
| DefaultProcessListener |
The DefaultProcessListener may be used to forward any output of a process to a logger.
|
| DerivateDistance |
Implementation of the Derivate Distance (DD) measure as published in "Using
derivatives in time series classification" by Tomasz Gorecki and Maciej
Luczak (2013).
|
| DerivateTransformDistance |
Implementation of the Derivate Transform Distance (TD) measure as published
in "Non-isometric transforms in time series classification using DTW" by
Tomasz Gorecki and Maciej Luczak (2014).
|
| DFT |
|
| DiscretizationHelper<D extends AILabeledAttributeArrayDataset<?,?>> |
This helper class provides methods that are required in order to discretize
numeric attributes.
|
| DiscretizationHelper.DiscretizationStrategy |
|
| Dyad |
Represents a dyad consisting of an instance and an alternative, represented
by feature vectors.
|
| DyadDatasetPoolProvider |
|
| DyadMinMaxScaler |
A scaler that can be fit to a certain dataset and then be used to normalize
dyad datasets, i.e. transform the data such that the values of each feature
lie between 0 and 1.
|
| DyadRankingDataset |
A dataset representation for dyad ranking.
|
| DyadRankingFeatureTransformNegativeLogLikelihood |
Implements the negative log-likelihood function for the feature
transformation Plackett-Luce dyad ranker.
|
| DyadRankingFeatureTransformNegativeLogLikelihoodDerivative |
Represents the derivative of the negative log-likelihood function in the
context of feature transformation Plackett-Luce dyad ranking [1].
|
| DyadRankingInstance |
A general implementation of a dyad ranking instance that contains an
immutable list of dyads to represent the ordering of dyads.
|
| DyadRankingLossFunction |
Loss function for evaluating dyad rankers.
|
| DyadRankingLossUtil |
Class that contains utility methods for handling dyad ranking losses.
|
| DyadRankingMLLossFunctionWrapper |
A wrapper for dyad ranking loss that enables already implemented multi label
classification loss functions to be used in this context.
|
| DyadStandardScaler |
A scaler that can be fit to a certain dataset and then be used to standardize
datasets, i.e. transform the data to have a mean of 0 and a standard
deviation of 1 according to the data it was fit to.
|
| DyadUnitIntervalScaler |
A scaler that can be fit to a certain dataset and then be used to normalize
datasets, i.e. transform the data to have a length of 1.
|
| DynamicTimeWarping |
Implementation of the Dynamic Time Warping (DTW) measure as published in
"Using Dynamic Time Warping to Find Patterns in Time Series" by Donald J.
|
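The dynamic-programming recurrence behind DTW can be sketched compactly. This is a minimal illustration with the absolute difference as local cost, not the library's implementation: cell (i, j) holds the cheapest cost of aligning the first i values of one series with the first j values of the other.

```java
// Compact DTW sketch: d[i][j] = cost(a[i-1], b[j-1]) + min of the three
// predecessor cells (match, insertion, deletion). Illustrative only.
public class DTWSketch {
    static double dtw(double[] a, double[] b) {
        int n = a.length, m = b.length;
        double[][] d = new double[n + 1][m + 1];
        for (double[] row : d) java.util.Arrays.fill(row, Double.POSITIVE_INFINITY);
        d[0][0] = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= m; j++) {
                double cost = Math.abs(a[i - 1] - b[j - 1]); // local distance
                d[i][j] = cost + Math.min(d[i - 1][j - 1],
                        Math.min(d[i - 1][j], d[i][j - 1]));
            }
        }
        return d[n][m];
    }

    public static void main(String[] args) {
        // Identical series have distance 0; a locally stretched copy warps cheaply.
        System.out.println(dtw(new double[] {1, 2, 3}, new double[] {1, 2, 3}));       // 0.0
        System.out.println(dtw(new double[] {1, 2, 3, 3}, new double[] {1, 1, 2, 3})); // 0.0
    }
}
```

The second call shows why DTW is preferred over the Euclidean distance for misaligned series: the repeated values are matched to each other at no cost.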
| EarlyAbandonMinimumDistanceSearchStrategy |
Class implementing a search strategy used for finding the minimum distance of
a Shapelet object to a time series.
|
| EMCNodeType |
|
| EMulticlassMeasure |
Enum summarizing all multiclass measures.
|
| EMultiClassPerformanceMeasure |
|
| EMultilabelPerformanceMeasure |
|
| Ensemble |
|
| EnsembleProvider |
Class statically providing preconfigured ensembles as commonly used in TSC
implementations.
|
| EuclideanDistance |
Implementation of the Euclidean distance for time series.
|
| EvaluationException |
|
| ExactIntervalAugSpaceSampler |
Samples interval-valued data from a dataset of precise points.
|
| ExactMatchAccuracy |
Computes the exact match of the predicted multi-label vector and the expected one.
|
| ExactMatchLoss |
|
| ExhaustiveMinimumDistanceSearchStrategy |
Class implementing a search strategy used for finding the minimum distance of
a Shapelet object to a time series.
|
| ExtendedM5Forest |
|
| ExtendedM5Tree |
|
| ExtendedRandomForest |
|
| ExtendedRandomTree |
Extension of a classic RandomTree to predict intervals.
|
| ExtrapolatedSaturationPointEvaluator<I extends ILabeledAttributeArrayInstance<?>,D extends IOrderedLabeledAttributeArrayDataset<I,?>> |
For the classifier, a learning curve is extrapolated from a given set of
anchorpoints.
|
| ExtrapolatedSaturationPointEvaluatorFactory |
|
| ExtrapolationRequest |
This class describes the request that is sent to an Extrapolation Service.
|
| ExtrapolationServiceClient<C> |
This class describes the client that is responsible for the communication
with an Extrapolation Service.
|
| F1MacroAverageL |
|
| F1MacroAverageLLoss |
Computes the inverted F1 measure macro-averaged by label.
|
| FeatureDomain |
Abstract description of a feature domain.
|
| FeatureSpace |
|
| FeatureTransformPLDyadRanker |
A feature transformation Plackett-Luce dyad ranker.
|
| FixedSplitClassifierEvaluator |
|
| FoldBasedSubsetInstruction |
|
| ForwardDifferenceDerivate |
Filter that calculates the forward difference derivative.
|
| FStat |
F-Stat quality measure performing an analysis of variance according to chapter
3.2 of the original paper.
|
| GlobalCharacterizer |
Characterizer that applies a number of Characterizers to a data set.
|
| GMeans<C extends org.apache.commons.math3.ml.clustering.Clusterable> |
Implementation of G-means based on Helen Beierling's implementation of
GMeans (https://github.com/helebeen/AILibs/blob/master/JAICore/jaicore-modifiedISAC/src/main/java/jaicore/modifiedISAC/ModifiedISACgMeans.java).
For more information see: "Hamerly, G., and Elkan, C. 2003.
|
| GmeansSampling<I extends INumericLabeledAttributeArrayInstance<? extends java.lang.Number>,D extends IDataset<I>> |
Implementation of a sampling method using gmeans-clustering.
|
| GmeansSamplingFactory<I extends INumericLabeledAttributeArrayInstance<? extends java.lang.Number>,D extends IDataset<I>> |
|
| GMeansStratiAmountSelectorAndAssigner<I extends INumericArrayInstance,D extends IDataset<I>> |
Combined strati amount selector and strati assigner via g-means.
|
| GradientDescentOptimizer |
An optimizer based on the gradient descent method [1].
|
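The gradient descent method behind GradientDescentOptimizer can be sketched in one dimension (an illustrative sketch, not the library's IGradientBasedOptimizer API): repeatedly step against the gradient until the iterate settles near a minimum.

```java
import java.util.function.DoubleUnaryOperator;

// Minimal one-dimensional gradient descent sketch: x <- x - lr * f'(x).
// Names and signature are illustrative, not the library's API.
public class GradientDescentSketch {
    static double minimize(DoubleUnaryOperator gradient, double x0,
                           double learningRate, int steps) {
        double x = x0;
        for (int i = 0; i < steps; i++) {
            x -= learningRate * gradient.applyAsDouble(x); // step against the gradient
        }
        return x;
    }

    public static void main(String[] args) {
        // f(x) = (x - 3)^2 has gradient 2(x - 3) and its minimum at x = 3.
        double x = minimize(v -> 2 * (v - 3), 10.0, 0.1, 200);
        System.out.println(x); // converges toward 3.0
    }
}
```

Each step contracts the distance to the minimum by a constant factor here, so a few hundred iterations suffice for this convex example.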
| GradientDescentOptimizerConfig |
|
| Group<C,I> |
Group.java - Stores a group with its center as ID and the associated instances.
|
| GroupBasedRanker<C,I,S> |
|
| GroupIdentifier<C> |
|
| GulloDerivate |
Calculates the derivative of a time series as described first by Gullo et al.
(2009).
|
| HammingAccuracy |
Measure for computing how similar two double vectors are according to the Hamming distance.
|
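The Hamming-based accuracy for multi-label vectors can be sketched directly (an illustrative implementation, not the library's API): it is the fraction of label positions on which prediction and ground truth agree, i.e. one minus the normalized Hamming distance.

```java
// Sketch of Hamming accuracy for binary multi-label vectors:
// fraction of positions where expected and actual labels agree.
public class HammingAccuracySketch {
    static double hammingAccuracy(int[] expected, int[] actual) {
        int matches = 0;
        for (int i = 0; i < expected.length; i++) {
            if (expected[i] == actual[i]) matches++;
        }
        return (double) matches / expected.length;
    }

    public static void main(String[] args) {
        // Agreement at positions 0 and 2 out of 4 labels.
        System.out.println(hammingAccuracy(new int[] {1, 0, 1, 1},
                                           new int[] {1, 1, 1, 0})); // 0.5
    }
}
```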
| HammingLoss |
|
| HellFormater |
|
| HighProbClassifier |
|
| HilbertTransform |
Calculates the Hilbert transform of a time series.
|
| HistogramBuilder |
|
| IActiveLearningPoolProvider<I extends ILabeledInstance> |
Provides a sample pool for pool-based active learning.
|
| IAttributeArrayInstance |
Interface of an instance that consists of attributes.
|
| IAttributeType<D> |
Wrapper interface for attribute types.
|
| IAttributeValue<D> |
A general interface for attribute values.
|
| IAugmentedSpaceSampler |
Interface representing a class that samples interval-valued data from a set of precise data points.
|
| IAugSpaceSamplingFunction |
|
| IBatchLearner<T,I,D extends IDataset<I>> |
|
| ICategoricalAttributeType |
Interface for categorical attribute types.
|
| ICertaintyProvider<T,I,D extends IDataset<I>> |
|
| IClassifierEvaluator |
|
| IClassifierEvaluatorFactory |
|
| IDataset<I> |
|
| IDatasetSplitter |
|
| IDistanceMetric<D,A,B> |
|
| IDyadFeatureTransform |
|
| IDyadRanker |
An abstract representation of a dyad ranker.
|
| IDyadRankingFeatureTransformPLGradientDescendableFunction |
An interface for a differentiable function in the context of feature
transformation Plackett-Luce dyad ranking.
|
| IDyadRankingFeatureTransformPLGradientFunction |
Represents a differentiable function in the context of dyad ranking based on
feature transformation Plackett-Luce models.
|
| IDyadRankingInstance |
|
| IDyadRankingPoolProvider |
Interface for an active learning pool provider in the context of dyad
ranking.
|
| IFilter |
|
| IGradientBasedOptimizer |
Interface for an optimizer that is based on a gradient descent and gets a
differentiable function and the derivation of said function to solve an
optimization problem.
|
| IGradientDescendableFunction |
This interface represents a function that is differentiable and thus can be
used by gradient descent algorithms.
|
| IGradientFunction |
Represents the gradient of a function that is differentiable.
|
| IGroupBuilder<C,I> |
IGroupBuilder describes the act of building groups out of problem instances.
|
| IGroupSolutionRankingSelect<C,S,I,P> |
|
| IInstance |
Interface of an instance which consists of attributes and a target value.
|
| IInstanceCollector<I> |
|
| IInstancesClassifier |
|
| ILabeledAttributeArrayDataset<L> |
|
| ILabeledAttributeArrayInstance<L> |
Type intersection for IAttributeArrayInstance and ILabeledInstance.
|
| ILabeledInstance<T> |
Interface of an instance that has a target value.
|
| ILearnShapeletsLearningAlgorithmConfig |
|
| IMeasure<I,O> |
The interface of a measure which computes a value of type O from expected and actual values of type I.
|
| IModifiableInstance |
|
| IMultiClassClassificationExperimentConfig |
|
| IMultilabelCrossValidation |
Represents an algorithm that splits given multilabel instances into folds, given a seed, custom information about the split represented as a string, and the fold that is left out for testing.
|
| IMultilabelMeasure |
Interface for measures dealing with multilabel data.
|
| IMultiValueAttributeType |
Interface for multi-value attribute types.
|
| InputOptimizerLoss |
|
| InputOptListener |
|
| Instance |
|
| Instances<I> |
|
| InstanceSchema<L> |
|
| InstanceWiseF1 |
Instance-wise F1 measure for multi-label classifiers.
|
| InstanceWiseF1AsLoss |
The F1 measure macro-averaged over the instances, as a loss.
|
| Instruction |
Instruction class that can be converted into json.
|
| InstructionFailedException |
|
| InstructionGraph |
|
| InstructionNode |
|
| IntervalAggregator |
An IntervalAggregator can aggregate a list of intervals; more precisely,
given a list of predictions in the leaf node, it can predict a range.
|
| INumericArrayInstance |
|
| INumericLabeledAttributeArrayInstance<L> |
Type intersection interface for numeric instances on one hand and labeled instances on the other hand.
|
| INumericLabeledIAttributeDataset<L> |
|
| InvalidAnchorPointsException |
Exception that is thrown when the anchorpoints generated for learning curve
extrapolation are not suitable.
|
| InversePowerLawConfiguration |
This class encapsulates the three parameters that are required in order to
create an Inverse Power Law function.
|
| InversePowerLawExtrapolationMethod |
This class describes a method for learning curve extrapolation which
generates an Inverse Power Law function.
|
| InversePowerLawLearningCurve |
Representation of a learning curve with the Inverse Power Law function, which has three parameters named a, b and c.
|
| IOnlineLearner<T,I,D extends IDataset<I>> |
The IOnlineLearner models a learning algorithm which works in an
online fashion, i.e. takes either a single IInstance or a Set
thereof as training input.
|
| IOrderedDataset<I> |
|
| IOrderedLabeledAttributeArrayDataset<I extends ILabeledAttributeArrayInstance<L>,L> |
Extends the IDataset by including the List interface.
|
| IOrderedLabeledDataset<I extends ILabeledInstance<L>,L> |
|
| IPipelineEvaluationConf |
|
| IPLDyadRanker |
An abstract representation for a dyad ranker using Plackett-Luce models.
|
| IPLNetDyadRankerConfiguration |
|
| IPredictiveModel<T,I,D extends IDataset<I>> |
|
| IPredictiveModelConfiguration |
|
| IPrimitiveAttributeType<D> |
Interface for primitive attribute types.
|
| IProcessListener |
|
| IQualityMeasure |
Interface for a quality measure assessing distances of instances to a
shapelet given the corresponding class values.
|
| IRankedSolutionCandidateProvider<I,S> |
|
| IRerunnableSamplingAlgorithmFactory<I,D extends IDataset<I>,A extends ASamplingAlgorithm<I,D>> |
Extension of the ISamplingAlgorithmFactory for sampling algorithms that can
reuse information from a previous run of the sampling algorithm.
|
| ISamplingAlgorithm<D extends IDataset<?>> |
Interface for sampling algorithms.
|
| ISamplingAlgorithm |
Interface for sampling algorithms.
|
| ISamplingAlgorithmFactory<I,D extends IDataset<I>,A extends ASamplingAlgorithm<I,D>> |
Interface for a factory, which creates a sampling algorithm.
|
| IScalarDistance |
Functional interface for the distance of two scalars.
|
| ISelectiveSamplingStrategy<I> |
A strategy for selective sampling.
|
| ISingleAttributeTransformer |
|
| ISplitBasedClassifierEvaluator<O> |
Interface for the evaluator measure bridge yielding the measured value as an instance of O.
|
| ISplitter |
|
| ISplitterFactory<T extends ISplitter> |
|
| IStratiAmountSelector<D extends IDataset<?>> |
Functional interface to write custom logic for selecting the amount of strati
for a dataset.
|
| IStratiAssigner<I,D extends IDataset<I>> |
Interface to write custom Assigner for datapoints to strati.
|
| IStratiFileAssigner |
Interface to implement custom Stratum assignment behavior.
|
| ITableGeneratorandCompleter<I,S,P> |
|
| ITimeSeriesComplexity |
Interface that describes the complexity measure of a time series.
|
| ITimeSeriesDistance |
Interface that describes a distance measure of two time series.
|
| ITimeSeriesDistanceWithTimestamps |
Interface that describes a distance measure of two time series that takes the
timestamps into account.
|
| ITreeClassifier |
|
| JaccardLoss |
|
| JaccardScore |
|
| KendallsTauDyadRankingLoss |
Computes the rank correlation measure known as Kendall's tau coefficient, i.e.
|
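The rank correlation used by KendallsTauDyadRankingLoss can be sketched by counting concordant and discordant item pairs (an illustrative implementation, not the library's API): tau = (C - D) / (n(n-1)/2), where a pair is concordant if both rankings order it the same way.

```java
// Sketch of Kendall's tau between two rankings given as rank arrays
// (ranksA[i] is the rank of item i in ranking A). Illustrative only.
public class KendallsTauSketch {
    static double kendallsTau(int[] ranksA, int[] ranksB) {
        int n = ranksA.length;
        int concordant = 0, discordant = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i + 1; j < n; j++) {
                int dirA = Integer.compare(ranksA[i], ranksA[j]);
                int dirB = Integer.compare(ranksB[i], ranksB[j]);
                if (dirA * dirB > 0) concordant++;      // same order in both
                else if (dirA * dirB < 0) discordant++; // opposite order
            }
        }
        return (concordant - discordant) / (n * (n - 1) / 2.0);
    }

    public static void main(String[] args) {
        System.out.println(kendallsTau(new int[] {1, 2, 3, 4}, new int[] {1, 2, 3, 4})); // 1.0
        System.out.println(kendallsTau(new int[] {1, 2, 3, 4}, new int[] {4, 3, 2, 1})); // -1.0
    }
}
```

Identical rankings score 1, fully reversed rankings score -1, and independent rankings score near 0.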
| KendallsTauOfTopK |
Calculates the Kendall's tau loss only for the top k dyads.
|
| KeoghDerivate |
Calculates the derivative of a time series as described first by Keogh and
Pazzani (2001).
|
| Kmeans<A,D> |
|
| KmeansSampling<I extends INumericLabeledAttributeArrayInstance<? extends java.lang.Number>,D extends IDataset<I>> |
Implementation of a sampling method using kmeans-clustering.
|
| KmeansSamplingFactory<I extends INumericLabeledAttributeArrayInstance<? extends java.lang.Number>,D extends IDataset<I>> |
|
| KMeansStratiAssigner<I extends INumericArrayInstance,D extends IDataset<I>> |
Cluster the data set with k-means into k Clusters, where each cluster stands
for one stratum.
|
| KNNAugSpaceSampler |
Samples interval-valued data from a dataset of precise points.
|
| L1DistanceMetric |
|
| LabeledInstance<L> |
|
| LabeledInstances<L> |
|
| LandmarkerCharacterizer |
A Characterizer that applies several characterizers to a data set, but does
not use any probing.
|
| LatexDatasetTableGenerator |
|
| LCNetClient |
|
| LCNetExtrapolationMethod |
This class represents a learning curve extrapolation using the LCNet
from pybnn.
|
| LearningCurve |
Interface for the result of a learning curve extrapolation.
|
| LearningCurveExtrapolatedEvent |
|
| LearningCurveExtrapolationEvaluator<I extends ILabeledAttributeArrayInstance<?>,D extends IOrderedLabeledAttributeArrayDataset<I,?>> |
Evaluates a classifier by predicting its learning curve with a few
anchorpoints.
|
| LearningCurveExtrapolationEvaluatorFactory |
|
| LearningCurveExtrapolationMethod |
Functional interface for extrapolating a learning curve from anchorpoints.
|
| LearningCurveExtrapolator<I extends ILabeledAttributeArrayInstance<?>,D extends IOrderedLabeledAttributeArrayDataset<I,?>> |
Abstract class for implementing a learning curve extrapolation method with
some anchor points.
|
| LearnPatternSimilarityClassifier |
Class representing the Learn Pattern Similarity classifier as described in
Baydogan, Mustafa & Runger, George. (2015).
|
| LearnPatternSimilarityLearningAlgorithm |
|
| LearnPatternSimilarityLearningAlgorithm.IPatternSimilarityConfig |
|
| LearnShapeletsClassifier |
LearnShapeletsClassifier published in "J.
|
| LearnShapeletsLearningAlgorithm |
Generalized Shapelets Learning implementation for
LearnShapeletsClassifier published in "J.
|
| LearnShapeletsLearningAlgorithm.ILearnShapeletsLearningAlgorithmConfig |
|
| LinearCombinationConstants |
This class contains required constant names for the linear combination
learning curve.
|
| LinearCombinationExtrapolationMethod |
This class describes a method for learning curve extrapolation which
generates a linear combination of suitable functions.
|
| LinearCombinationFunction |
This is a basic class that describes a function which is a weighted
combination of individual functions.
|
| LinearCombinationLearningCurve |
The LinearCombinationLearningCurve consists of the actual linear combination
function that describes the learning curve, as well as the derivative of this
function.
|
| LinearCombinationLearningCurveConfiguration |
A configuration for a linear combination learning curve consists of
parameterizations for at least one linear combination function.
|
| LinearCombinationParameterSet |
This class encapsulates all parameters that are required in order to create a
weighted linear combination of parameterized functions.
|
| LoadDataSetInstruction |
Instruction for dataset loading; provider and id are used to identify the data set.
|
| LoadDataSetInstructionForARFFFile |
|
| LoadDatasetInstructionForOpenML |
|
| LocalCaseControlSampling<I extends ILabeledAttributeArrayInstance<?>,D extends IDataset<I>> |
|
| LocalCaseControlSamplingFactory<I extends ILabeledAttributeArrayInstance<?>,D extends IDataset<I>> |
|
| LossScoreTransformer<I> |
This transformer transforms a decomposable double measure from a scoring function to a loss or vice versa.
|
| MajorityConfidenceVote |
Vote implementation for majority confidence.
|
| ManhattanDistance |
Implementation of the Manhattan distance for time series.
|
| MathUtil |
Utility class consisting of mathematical utility functions.
|
| MCCVSplitEvaluationEvent |
|
| MCTreeMergeNode |
|
| MCTreeNode |
|
| MCTreeNodeLeaf |
|
| MCTreeNodeReD |
|
| MCTreeNodeReDLeaf |
|
| MeanSquaredErrorLoss |
|
| MeasureAggregatedComputationEvent<INPUT,OUTPUT> |
|
| MeasureAvgComputationEvent<INPUT,OUTPUT> |
|
| MeasureListComputationEvent<INPUT,OUTPUT> |
|
| MeasureSingleComputationEvent<INPUT,OUTPUT> |
|
| MinHashingTransformer |
Converts the sets of multi-value features to short signatures.
|
| MLExperiment |
|
| ModelBuildFailedException |
|
| ModifiedISAC |
|
| ModifiedISACEvaluator |
|
| ModifiedISACgMeans |
|
| ModifiedISACGroupBuilder |
|
| ModifiedISACInstanceCollector |
|
| ModifiedISACkMeans |
|
| MonteCarloCrossValidationEvaluator |
A classifier evaluator that can perform a (Monte Carlo) cross-validation on
the given dataset.
|
| MonteCarloCrossValidationEvaluatorFactory |
Factory for configuring standard Monte Carlo cross-validation evaluators.
|
| MoveSplitMerge |
Implementation of the Move-Split-Merge (MSM) measure as published in "The
Move-Split-Merge Metric for Time Series" by Alexandra Stefan, Vassilis
Athitsos and Gautam Das (2013).
|
| MulticlassClassStratifiedSplitter |
Makes use of the WekaUtil to create a class-oriented stratified split of the data preserving the class distribution.
|
| MultiClassMeasureBuilder |
|
| MultilabelDatasetSplitter |
This class provides methods to obtain train and test splits for a given data
set and split technique.
|
| MultiValueAttributeType |
The multi-value attribute type describes the domain a value of a respective multi-value attribute value stems from.
|
| MultiValueAttributeValue |
Multi-value attribute value as it can be part of an instance.
|
| MultiValueBinaryzationTransformer |
Transforms a multi-valued feature into a 0/1 vector, where each dimension
represents one of the values: a 1 in a dimension means the feature contains
this value, a 0 means it does not.
|
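The binarization described above can be sketched as follows (an illustrative implementation, not the library's API): given the full value domain of the feature, each set of contained values becomes a fixed-length 0/1 vector.

```java
import java.util.List;
import java.util.Set;

// Sketch of 0/1 binarization of a multi-valued feature: one dimension
// per domain value, set to 1 iff the feature contains that value.
public class MultiValueBinarizationSketch {
    static int[] binarize(List<String> domain, Set<String> featureValues) {
        int[] vector = new int[domain.size()];
        for (int i = 0; i < domain.size(); i++) {
            vector[i] = featureValues.contains(domain.get(i)) ? 1 : 0;
        }
        return vector;
    }

    public static void main(String[] args) {
        List<String> domain = List.of("red", "green", "blue");
        int[] v = binarize(domain, Set.of("red", "blue"));
        System.out.println(java.util.Arrays.toString(v)); // [1, 0, 1]
    }
}
```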
| NDCGLoss |
The Normalized Discounted Cumulative Gain for ranking.
|
| NearestNeighborClassifier |
K-Nearest-Neighbor classifier for time series.
|
| NearestNeighborClassifier.VoteType |
Vote types that describe how to aggregate the prediction for a test instance over its nearest neighbors.
|
| NearestNeighborLearningAlgorithm |
Training algorithm for the nearest neighbors classifier.
|
| NegIdentityInpOptLoss |
Loss function for PLNet input optimization that maximizes the output of a PLNet.
|
| NoneFittedFilterExeception |
|
| NoProbingCharacterizer |
A Characterizer that applies several characterizers to a data set, but does
not use any probing.
|
| Normalizer |
|
| NumericAttributeType |
The numeric attribute type.
|
| NumericAttributeValue |
Numeric attribute value as it can be part of an instance.
|
| NumericFeatureDomain |
Description of a numeric feature domain.
|
| OneHotEncodingTransformer |
|
| OpenMLHelper |
|
| OSMAC<I extends ILabeledAttributeArrayInstance<?>,D extends IDataset<I>> |
|
| OSMACSamplingFactory<I extends ILabeledAttributeArrayInstance<?>,D extends IDataset<I>> |
|
| ParametricFunction |
This is a basic class that describes a function that can be parameterized with a set of parameters.
|
| PilotEstimateSampling<I extends ILabeledAttributeArrayInstance<?>,D extends IDataset<I>> |
|
| PLNetDyadRanker |
A dyad ranker based on a Plackett-Luce network.
|
| PLNetInputOptimizer |
Optimizes a given loss function (InputOptimizerLoss) with respect to the input of a PLNet using gradient descent.
|
| PLNetLoss |
Implements the negative log likelihood (NLL) loss function for PL networks as described in [1].
|
| PointWiseLearningCurve |
This class represents a learning curve that gets returned by the
LCNet from pybnn.
|
| PoolBasedUncertaintySamplingStrategy<T,I extends ILabeledInstance,D extends IDataset<I>> |
A simple pool-based uncertainty sampling strategy, which assesses certainty
for all instances in the pool and picks the instance with least certainty for
the next query.
|
| PPA |
|
| PrecisionAsLoss |
|
| PredictionException |
|
| PredictionFailedException |
|
| ProbabilisticMonteCarloCrossValidationEvaluator |
A classifier evaluator that can perform a (Monte Carlo) cross-validation on
the given dataset.
|
| ProbabilisticMonteCarloCrossValidationEvaluatorFactory |
Factory for configuring probabilistic Monte Carlo cross-validation evaluators.
|
| ProblemInstance<I> |
|
| PrototypicalPoolBasedActiveDyadRanker |
A prototypical active dyad ranker based on the idea of uncertainty sampling.
|
| QuantileAggregator |
|
| RandomlyRankedNodeQueue<N,V extends java.lang.Comparable<V>> |
A node queue for the best first search that inserts new nodes at a random
position in the list.
|
| RandomlyRankedNodeQueueConfig<T> |
|
| RandomMultilabelCrossValidation |
Class executing pseudo-random splits to enable multilabel cross-validation.
|
| RandomPoolBasedActiveDyadRanker |
A random active dyad ranker.
|
| RandomSplitter |
|
| RangeQueryPredictor |
|
| Ranker<S,P> |
|
| Ranking<S> |
|
| RankingForGroup<C,S> |
RankingForGroup.java - stores a solution ranking for a group identified by its group identifier.
|
| RankLoss |
|
| RankScore |
|
| ReductionGraphGenerator |
|
| ReductionOptimizer |
|
| ReproducibleInstances |
New Instances class to track splits and data origin.
|
| ReservoirSampling |
Implementation of the Reservoir Sampling algorithm (comparable to a Simple
Random Sampling for streamed data).
|
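The classic reservoir sampling scheme (Algorithm R) behind this entry can be sketched as follows (an illustrative implementation, not the library's API): one pass over a stream keeps a uniform sample of fixed size k without knowing the stream length in advance.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Sketch of reservoir sampling (Algorithm R): the i-th element replaces
// a random reservoir slot with probability k/i, which keeps the sample uniform.
public class ReservoirSamplingSketch {
    static List<Integer> sample(Iterable<Integer> stream, int k, Random rng) {
        List<Integer> reservoir = new ArrayList<>(k);
        int seen = 0;
        for (int element : stream) {
            seen++;
            if (reservoir.size() < k) {
                reservoir.add(element); // fill the reservoir first
            } else {
                int slot = rng.nextInt(seen); // uniform in [0, seen)
                if (slot < k) reservoir.set(slot, element);
            }
        }
        return reservoir;
    }

    public static void main(String[] args) {
        List<Integer> stream = new ArrayList<>();
        for (int i = 0; i < 1000; i++) stream.add(i);
        List<Integer> s = sample(stream, 10, new Random(0));
        System.out.println(s); // 10 elements drawn uniformly from 0..999
    }
}
```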
| RootMeanSquaredErrorLoss |
The root mean squared loss function.
|
| RPNDSplitter |
|
| RQPHelper |
|
| RQPHelper.IntervalAndHeader |
|
| SampleElementAddedEvent |
|
| SAX |
|
| ScalarDistanceUtil |
|
| ScikitLearnWrapper |
Wraps a scikit-learn Python process, utilizing a template to start the given classifier in scikit-learn.
|
| ScikitLearnWrapper.ProblemType |
|
| SFA |
|
| Shapelet |
Implementation of a shapelet, i.e. a specific subsequence of a time series
representing a characteristic shape.
|
| ShapeletTransformLearningAlgorithm |
Algorithm training a ShapeletTransform classifier as described in "A Shapelet
Transform for Time Series Classification" by Jason Lines, Luke M. Davis, Jon
Hills, and Anthony Bagnall (2012).
|
| ShapeletTransformLearningAlgorithm.IShapeletTransformLearningAlgorithmConfig |
|
| ShapeletTransformTSClassifier |
Class for a ShapeletTransform classifier as described in "A Shapelet Transform for Time Series Classification" by Jason Lines, Luke M. Davis, Jon Hills, and Anthony Bagnall (2012).
|
| ShotgunDistance |
Implementation of the Shotgun Distance measure as published in "Towards Time
Series Classification without Human Preprocessing" by Patrick Schäfer (2014).
|
| ShotgunEnsembleClassifier |
Implementation of the Shotgun Ensemble Classifier as published in "Towards Time
Series Classification without Human Preprocessing" by Patrick Schäfer (2014).
|
| ShotgunEnsembleLearnerAlgorithm |
Implementation of the Shotgun Ensemble Algorithm as published in "Towards Time
Series Classification without Human Preprocessing" by Patrick Schäfer (2014).
|
| ShotgunEnsembleLearnerAlgorithm.IShotgunEnsembleLearnerConfig |
|
| SimpleDataset<L> |
|
| SimpleInstance<L> |
|
| SimpleInstanceImpl |
|
| SimpleInstancesImpl |
|
| SimpleLabeledInstanceImpl |
|
| SimpleLabeledInstancesImpl |
|
| SimpleMLCSplitBasedClassifierEvaluator |
|
| SimpleRandomSampling<I,D extends IOrderedDataset<I>> |
|
| SimpleRandomSamplingFactory<I,D extends IOrderedDataset<I>> |
|
| SimpleSLCSplitBasedClassifierEvaluator |
|
| SimplifiedTimeSeriesLoader |
Time series loader class which provides functionality to read datasets from
files and store them in simplified, more efficient time series datasets.
|
| SineTransform |
Calculates the sine transform of a time series.
|
| SingleRandomSplitClassifierEvaluator |
|
| SlidingWindowBuilder |
|
| SparseDyadRankingInstance |
A dyad ranking instance implementation that assumes the same instance for all
dyads contained in its ordering.
|
| SplitFailedException |
|
| SplitInstruction |
|
| SquaredBackwardDifferenceComplexity |
Complexity metric as described in "A Complexity-Invariant Distance Measure
for Time Series".
$$ c = \sum_{i=1}^{n-1} \sqrt{ (T_i - T_{i+1})^2 } $$
where $T_i$ are the values of the time series.
|
| StratifiedFileSampling |
|
| StratifiedSampling<I,D extends IOrderedDataset<I>> |
Implementation of Stratified Sampling: Divide the dataset into strata and
sample from each of these.
|
| StratifiedSamplingFactory<I,D extends IOrderedDataset<I>> |
|
| StratifiedSplitSubsetInstruction |
Computes a two-fold split.
|
| StretchingComplexity |
Stretching Complexity that calulates the length of a time series when
stretched to a straight line.
$$ c = sum_{i=1}^n-1 \sqrt{ (t_2 - t_1)^2 + (T_{i+1} - T_i)^2 }$$
where $t_i$ are the timestamps (here $t_i = i$) an $T_i$ are the values of
the time series.
|
| SubInstances |
|
| SystematicFileSampling |
File-level implementation of Systematic Sampling: Sort datapoints and pick
every k-th datapoint for the sample.
|
| SystematicSampling<I extends INumericArrayInstance,D extends IOrderedDataset<I>> |
Implementation of Systematic Sampling: Sort datapoints and pick every k-th
datapoint for the sample.
|
| SystematicSamplingFactory<I extends INumericArrayInstance,D extends IOrderedDataset<I>> |
|
| Table<I,S,P> |
Table.java - This class is used to store problem instances and their
corresponding solutions and performances for those solutions.
|
| TimeoutableEvaluator |
|
| TimeSeriesAttributeType |
Describes a time series type as a 1-NDArray with a fixed length.
|
| TimeSeriesAttributeValue |
Represents a time series attribute value, as it can be part of a
jaicore.ml.core.dataset.IInstance
|
| TimeSeriesBagOfFeaturesClassifier |
Implementation of the Time Series Bag-of-Features (TSBF) classifier as
described in "A Bag-of-Features Framework to Classify Time Series" by Mustafa
Baydogan, George Runger, and Eugene Tuv (2013).
|
| TimeSeriesBagOfFeaturesLearningAlgorithm |
Algorithm to train a Time Series Bag-of-Features (TSBF) classifier as
described in "A Bag-of-Features Framework to Classify Time Series" by Mustafa
Baydogan, George Runger, and Eugene Tuv (2013).
|
| TimeSeriesBagOfFeaturesLearningAlgorithm.ITimeSeriesBagOfFeaturesConfig |
|
| TimeSeriesBatchLoader |
BatchLoader
|
| TimeSeriesDataset<L> |
Time Series Dataset.
|
| TimeSeriesDataset |
Dataset for time series.
|
| TimeSeriesFeature |
Class calculating features (e.g. mean, stddev or slope) on given
subsequences of time series.
|
| TimeSeriesFeature.FeatureType |
Feature types used within the time series tree.
|
| TimeSeriesForestClassifier |
Time series forest classifier as described in Deng, Houtao et al.
|
| TimeSeriesForestLearningAlgorithm |
Algorithm to train a time series forest classifier as described in Deng,
Houtao et al.
|
| TimeSeriesForestLearningAlgorithm.ITimeSeriesForestConfig |
|
| TimeSeriesInstance<L> |
TimeSeriesInstance
|
| TimeSeriesLengthException |
Exception class encapsulating faulty behavior related to the length of time series.
|
| TimeSeriesLoadingException |
Exception thrown when a time series dataset could not be extracted from an
external data source (e.g. a file).
|
| TimeSeriesTreeClassifier |
Time series tree as described in Deng, Houtao et al.
|
| TimeSeriesTreeLearningAlgorithm |
Algorithm to build a time series tree as described in Deng, Houtao et al.
|
| TimeSeriesTreeLearningAlgorithm.ITimeSeriesTreeConfig |
|
| TimeSeriesUtil |
Utility class for time series operations.
|
| TimeWarpEditDistance |
Time Warp Edit Distance as published in "Time Warp Edit Distance with
Stiffness Adjustment for Time Series Matching" by Pierre-Francois Marteau
(2009).
|
| TopKOfPredicted |
Calculates if the top-k dyads of the predicted ranking match the top-k dyads
of the actual ranking.
|
| TrainingException |
|
| TransformDistance |
Implementation of the Transform Distance (TD) measure as published in
"Non-isometric transforms in time series classification using DTW" by Tomasz
Gorecki and Maciej Luczak (2014).
|
| TSClassifier<L,V,D extends TimeSeriesDataset<L>> |
Time series classifier which can be trained and used as a predictor.
|
| TSLearningProblem |
|
| UCBPoolBasedActiveDyadRanker |
A prototypical active dyad ranker based on the UCB decision rule.
|
| UncheckedJaicoreMLException |
|
| WaitForSamplingStepEvent |
|
| WeightedDynamicTimeWarping |
Implementation of the Weighted Dynamic Time Warping (DTW) measure as published
in "Weighted dynamic time warping for time series classification" by Young-Seon
Jeong, Myong K. Jeong, and Olufemi A. Omitaomu (2011).
|
| WekaCompatibleInstancesImpl |
|
| WekaInstance<L> |
|
| WekaInstances<L> |
|
| WekaInstancesFeatureUnion |
|
| WekaInstancesUtil |
|
| WekaUtil |
WekaUtil
|
| WekaUtil |
|
| ZeroOneLoss |
|
| ZeroShotUtil |
A collection of utility methods used to map the results of an input optimization of the PLNetInputOptimizer back to Weka options for the respective classifiers.
|
| ZTransformer |
|
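The two complexity measures listed above (SquaredBackwardDifferenceComplexity and StretchingComplexity) follow directly from their formulas. A minimal sketch of both computations; the class and method names here are illustrative only and do not mirror the library's API:

```java
/** Illustrative sketch of the two complexity formulas; not the library's API. */
public class ComplexitySketch {

    // c = sum_{i=1}^{n-1} sqrt((T_i - T_{i+1})^2),
    // i.e. the total absolute backward difference of the series values.
    public static double squaredBackwardDifferenceComplexity(double[] series) {
        double c = 0.0;
        for (int i = 0; i < series.length - 1; i++) {
            double d = series[i] - series[i + 1];
            c += Math.sqrt(d * d);
        }
        return c;
    }

    // c = sum_{i=1}^{n-1} sqrt((t_{i+1} - t_i)^2 + (T_{i+1} - T_i)^2) with t_i = i,
    // i.e. the length of the series viewed as a polyline stretched to a straight line.
    public static double stretchingComplexity(double[] series) {
        double c = 0.0;
        for (int i = 0; i < series.length - 1; i++) {
            double dv = series[i + 1] - series[i];
            c += Math.sqrt(1.0 + dv * dv); // (t_{i+1} - t_i)^2 = 1 since t_i = i
        }
        return c;
    }
}
```

For a constant series, the backward-difference complexity is 0 and the stretching complexity equals n - 1, matching the intuition that a flat line carries no shape complexity beyond its length.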