Class AMonteCarloCrossValidationBasedEvaluatorFactory<F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>>
- java.lang.Object
-
- ai.libs.jaicore.ml.core.evaluation.evaluator.factory.AMonteCarloCrossValidationBasedEvaluatorFactory<F>
-
- All Implemented Interfaces:
ISplitBasedSupervisedLearnerEvaluatorFactory<org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance, org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>, F>, ISupervisedLearnerEvaluatorFactory<org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance, org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>>, org.api4.java.ai.ml.core.evaluation.IPredictionPerformanceMetricConfigurable, org.api4.java.ai.ml.core.IDataConfigurable<org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>>, org.api4.java.common.control.IRandomConfigurable
- Direct Known Subclasses:
MonteCarloCrossValidationEvaluatorFactory
public abstract class AMonteCarloCrossValidationBasedEvaluatorFactory<F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>> extends java.lang.Object implements ISplitBasedSupervisedLearnerEvaluatorFactory<org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance,org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>,F>, org.api4.java.common.control.IRandomConfigurable, org.api4.java.ai.ml.core.IDataConfigurable<org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>>, org.api4.java.ai.ml.core.evaluation.IPredictionPerformanceMetricConfigurable
An abstract factory for configuring Monte Carlo cross-validation based evaluators.
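To make the evaluation scheme concrete, the following is a minimal, self-contained sketch of what a Monte Carlo cross-validation run does with the parameters this factory configures (numMCIterations, trainFoldSize, and the random source). The dataset, the "learner", and the loss below are hypothetical placeholders for illustration, not the library's actual types:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class MccvSketch {

    // For numMCIterations repetitions: shuffle the data, split it at
    // trainFoldSize, evaluate on the held-out part, and average the losses.
    static double evaluate(List<Integer> data, int numMCIterations, double trainFoldSize, Random random) {
        double lossSum = 0.0;
        for (int i = 0; i < numMCIterations; i++) {
            List<Integer> shuffled = new ArrayList<>(data);
            Collections.shuffle(shuffled, random);
            int cut = (int) Math.round(trainFoldSize * shuffled.size());
            List<Integer> train = shuffled.subList(0, cut);
            List<Integer> test = shuffled.subList(cut, shuffled.size());
            lossSum += dummyLoss(train, test); // stand-in for learner training + metric
        }
        return lossSum / numMCIterations; // mean loss over all Monte Carlo splits
    }

    // Placeholder loss: fraction of test values never seen in the train fold.
    static double dummyLoss(List<Integer> train, List<Integer> test) {
        long unseen = test.stream().filter(x -> !train.contains(x)).count();
        return test.isEmpty() ? 0.0 : (double) unseen / test.size();
    }

    public static void main(String[] args) {
        List<Integer> data = List.of(0, 1, 0, 1, 1, 0, 1, 0, 0, 1);
        double meanLoss = evaluate(data, 5, 0.7, new Random(42));
        System.out.println(meanLoss >= 0.0 && meanLoss <= 1.0); // prints true
    }
}
```

Unlike k-fold cross-validation, the Monte Carlo variant draws each split independently at random, so the number of repetitions and the train fold size can be chosen freely.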
-
-
Field Summary
Fields
- protected org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> data
- protected org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,?> metric
- protected int numMCIterations
- protected java.util.Random random
-
Constructor Summary
Constructors
- protected AMonteCarloCrossValidationBasedEvaluatorFactory(): Standard constructor.
-
Method Summary
Methods
- boolean getCacheSplitSets()
- org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> getData(): Getter for the dataset which is used for splitting.
- org.api4.java.ai.ml.core.dataset.splitter.IDatasetSplitter<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?>> getDatasetSplitter(): Getter for the dataset splitter.
- org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,?> getMeasure()
- org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,?> getMetric()
- int getNumMCIterations(): Getter for the number of iterations, i.e. the number of splits considered.
- abstract F getSelf()
- int getTimeoutForSolutionEvaluation(): Getter for the timeout for evaluating a solution.
- double getTrainFoldSize(): Getter for the size of the train fold.
- void setData(org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance> data)
- void setMeasure(org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,?> measure)
- void setRandom(java.util.Random random)
- F withCacheSplitSets(boolean cacheSplitSets)
- F withData(org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> data): Configures the dataset which is split into train and test data.
- F withDatasetSplitter(org.api4.java.ai.ml.core.dataset.splitter.IDatasetSplitter<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?>> datasetSplitter): Configures the evaluator to use the given dataset splitter.
- F withMeasure(org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,?> measure)
- F withNumMCIterations(int numMCIterations): Configures the number of Monte Carlo cross-validation iterations.
- F withRandom(java.util.Random random)
- F withTimeoutForSolutionEvaluation(int timeoutForSolutionEvaluation): Configures a timeout for evaluating a solution.
- F withTrainFoldSize(double trainFoldSize): Configures the portion of the training data relative to the entire dataset size.
-
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface ai.libs.jaicore.ml.core.evaluation.evaluator.factory.ISupervisedLearnerEvaluatorFactory
getLearnerEvaluator
-
Field Detail
-
random
protected java.util.Random random
-
numMCIterations
protected int numMCIterations
-
data
protected org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> data
-
metric
protected org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,?> metric
-
-
Method Detail
-
getDatasetSplitter
public org.api4.java.ai.ml.core.dataset.splitter.IDatasetSplitter<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?>> getDatasetSplitter()
Getter for the dataset splitter.
- Specified by: getDatasetSplitter in interface ISplitBasedSupervisedLearnerEvaluatorFactory<org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance, org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>, F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>>
- Returns: The dataset splitter.
-
getNumMCIterations
public int getNumMCIterations()
Getter for the number of iterations, i.e. the number of splits considered.
- Returns: The number of iterations.
-
getData
public org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> getData()
Getter for the dataset which is used for splitting.
- Specified by: getData in interface org.api4.java.ai.ml.core.IDataConfigurable<F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>>
- Returns: The original dataset that is being split.
-
getTrainFoldSize
public double getTrainFoldSize()
Getter for the size of the train fold.
- Returns: The portion of the training data.
-
getTimeoutForSolutionEvaluation
public int getTimeoutForSolutionEvaluation()
Getter for the timeout for evaluating a solution.
- Returns: The timeout for evaluating a solution.
-
getMetric
public org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,?> getMetric()
-
withDatasetSplitter
public F withDatasetSplitter(org.api4.java.ai.ml.core.dataset.splitter.IDatasetSplitter<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?>> datasetSplitter)
Configures the evaluator to use the given dataset splitter.
- Specified by: withDatasetSplitter in interface ISplitBasedSupervisedLearnerEvaluatorFactory<org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance, org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>, F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>>
- Parameters: datasetSplitter - The dataset splitter to be used.
- Returns: The factory object.
-
withRandom
public F withRandom(java.util.Random random)
-
withNumMCIterations
public F withNumMCIterations(int numMCIterations)
Configures the number of Monte Carlo cross-validation iterations.
- Parameters: numMCIterations - The number of iterations to run.
- Returns: The factory object.
-
withData
public F withData(org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> data)
Configures the dataset which is split into train and test data.
- Parameters: data - The dataset to be split.
- Returns: The factory object.
-
withTrainFoldSize
public F withTrainFoldSize(double trainFoldSize)
Configures the portion of the training data relative to the entire dataset size.
- Parameters: trainFoldSize - The size of the training fold, as a fraction in the open interval (0,1).
- Returns: The factory object.
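As a small illustration of how trainFoldSize relates to the two folds (the library's exact rounding of the split point is not specified here, so the Math.round below is an assumption), a value of 0.7 on a dataset of 100 instances gives roughly a 70/30 split:

```java
public class FoldSizeDemo {
    public static void main(String[] args) {
        int datasetSize = 100;
        double trainFoldSize = 0.7; // portion of the whole dataset used for training
        int trainSize = (int) Math.round(trainFoldSize * datasetSize); // 70
        int testSize = datasetSize - trainSize;                        // 30
        System.out.println(trainSize + "/" + testSize); // prints 70/30
    }
}
```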
-
withTimeoutForSolutionEvaluation
public F withTimeoutForSolutionEvaluation(int timeoutForSolutionEvaluation)
Configures a timeout for evaluating a solution.
- Parameters: timeoutForSolutionEvaluation - The timeout for evaluating a solution.
- Returns: The factory object.
-
getSelf
public abstract F getSelf()
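getSelf() exists because of the recursive type parameter F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>: it lets the abstract base class return the concrete subclass type from its fluent with* methods, so chained calls keep the subclass's static type. The following is a simplified, self-contained sketch of that pattern; the class and field names are illustrative, not the library's:

```java
// Recursive generics: F is constrained to be the concrete subclass itself.
abstract class BaseFactory<F extends BaseFactory<F>> {
    protected int numIterations;

    // Fluent setter typed to the subclass, so chaining keeps the concrete type.
    public F withNumIterations(int n) {
        this.numIterations = n;
        return this.getSelf();
    }

    // Each concrete subclass implements this as "return this".
    public abstract F getSelf();
}

class ConcreteFactory extends BaseFactory<ConcreteFactory> {
    @Override
    public ConcreteFactory getSelf() {
        return this;
    }
}

public class SelfTypeDemo {
    public static void main(String[] args) {
        // withNumIterations returns ConcreteFactory, not BaseFactory,
        // so subclass-specific calls could be chained afterwards.
        ConcreteFactory f = new ConcreteFactory().withNumIterations(5);
        System.out.println(f.numIterations); // prints 5
    }
}
```

Without getSelf(), the base class could only `return this` typed as the base class, and a chain starting from a subclass would lose access to subclass-specific methods.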
-
setMeasure
public void setMeasure(org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,?> measure)
- Specified by: setMeasure in interface org.api4.java.ai.ml.core.evaluation.IPredictionPerformanceMetricConfigurable
-
setData
public void setData(org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance> data)
- Specified by: setData in interface org.api4.java.ai.ml.core.IDataConfigurable<F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>>
-
setRandom
public void setRandom(java.util.Random random)
- Specified by: setRandom in interface org.api4.java.common.control.IRandomConfigurable
-
withMeasure
public F withMeasure(org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,?> measure)
-
getMeasure
public org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,?> getMeasure()
-
withCacheSplitSets
public F withCacheSplitSets(boolean cacheSplitSets)
-
getCacheSplitSets
public boolean getCacheSplitSets()
-
-