Class AMonteCarloCrossValidationBasedEvaluatorFactory<F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>>

  • All Implemented Interfaces:
    ISupervisedLearnerEvaluatorFactory<org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance,​org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>>, org.api4.java.ai.ml.core.evaluation.IPredictionPerformanceMetricConfigurable, org.api4.java.ai.ml.core.IDataConfigurable<org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>>, org.api4.java.common.control.IRandomConfigurable
    Direct Known Subclasses:
    MonteCarloCrossValidationEvaluatorFactory

    public abstract class AMonteCarloCrossValidationBasedEvaluatorFactory<F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>>
    extends java.lang.Object
    implements ISupervisedLearnerEvaluatorFactory<org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance,​org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>>, org.api4.java.common.control.IRandomConfigurable, org.api4.java.ai.ml.core.IDataConfigurable<org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance>>, org.api4.java.ai.ml.core.evaluation.IPredictionPerformanceMetricConfigurable
    An abstract factory for configuring Monte Carlo cross-validation based evaluators.
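The recursive type bound F extends AMonteCarloCrossValidationBasedEvaluatorFactory&lt;F&gt;, together with the abstract getSelf() method, is a self-type idiom: it lets the chained with*() methods declared in this abstract class return the concrete subclass type, so fluent configuration chains keep compiling against the subclass. The following minimal sketch re-creates that idiom with hypothetical stand-in classes (AFactory, ConcreteFactory are illustrative names, not part of the library):

```java
// Minimal sketch of the self-type pattern used by this class. The recursive
// bound F extends AFactory<F> lets each with*() call return the concrete
// subclass type via getSelf(), keeping the fluent chain typed as the subclass.
abstract class AFactory<F extends AFactory<F>> {
    protected int numMCIterations;
    protected double trainFoldSize;

    // Every concrete subclass returns itself; mirrors the abstract getSelf() here.
    public abstract F getSelf();

    public F withNumMCIterations(final int numMCIterations) {
        this.numMCIterations = numMCIterations;
        return this.getSelf();
    }

    public F withTrainFoldSize(final double trainFoldSize) {
        this.trainFoldSize = trainFoldSize;
        return this.getSelf();
    }
}

class ConcreteFactory extends AFactory<ConcreteFactory> {
    @Override
    public ConcreteFactory getSelf() {
        return this;
    }
}

public class SelfTypeDemo {
    public static void main(final String[] args) {
        // The chain is typed as ConcreteFactory throughout, not as the abstract type.
        ConcreteFactory f = new ConcreteFactory().withNumMCIterations(5).withTrainFoldSize(0.7);
        System.out.println(f.numMCIterations + " " + f.trainFoldSize);
    }
}
```

Without getSelf(), a with*() method defined in the abstract class could only return the abstract type, and the chain would lose access to any methods the subclass adds.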
    • Field Summary

      Fields 
      Modifier and Type Field Description
      protected org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> data  
      protected org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,​?> metric  
      protected int numMCIterations  
      protected java.util.Random random  
    • Method Summary

      All Methods Instance Methods Abstract Methods Concrete Methods 
      Modifier and Type Method Description
      org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> getData()
      Getter for the dataset which is used for splitting.
      org.api4.java.ai.ml.core.dataset.splitter.IDatasetSplitter<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?>> getDatasetSplitter()
      Getter for the dataset splitter.
      int getNumMCIterations()
      Getter for the number of iterations, i.e. the number of splits considered.
      abstract F getSelf()  
      int getTimeoutForSolutionEvaluation()
      Getter for the timeout for evaluating a solution.
      double getTrainFoldSize()
      Getter for the size of the train fold.
      void setData​(org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance> data)  
      void setMeasure​(org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,​?> measure)  
      void setRandom​(java.util.Random random)  
      F withData​(org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> data)
      Configures the dataset which is split into train and test data.
      F withDatasetSplitter​(org.api4.java.ai.ml.core.dataset.splitter.IDatasetSplitter<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?>> datasetSplitter)
      Configures the evaluator to use the given dataset splitter.
      F withMeasure​(org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,​?> measure)  
      F withNumMCIterations​(int numMCIterations)
Configures the number of Monte Carlo cross-validation iterations.
      F withRandom​(java.util.Random random)  
      F withTimeoutForSolutionEvaluation​(int timeoutForSolutionEvaluation)
      Configures a timeout for evaluating a solution.
      F withTrainFoldSize​(double trainFoldSize)
      Configures the portion of the training data relative to the entire dataset size.
      • Methods inherited from class java.lang.Object

        clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
    • Field Detail

      • random

        protected java.util.Random random
      • numMCIterations

        protected int numMCIterations
      • data

        protected org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> data
      • metric

        protected org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,​?> metric
    • Constructor Detail

      • AMonteCarloCrossValidationBasedEvaluatorFactory

        protected AMonteCarloCrossValidationBasedEvaluatorFactory()
        Standard constructor.
    • Method Detail

      • getDatasetSplitter

        public org.api4.java.ai.ml.core.dataset.splitter.IDatasetSplitter<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?>> getDatasetSplitter()
        Getter for the dataset splitter.
        Returns:
        The dataset splitter.
      • getNumMCIterations

        public int getNumMCIterations()
        Getter for the number of iterations, i.e. the number of splits considered.
        Returns:
        The number of iterations.
      • getData

        public org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> getData()
        Getter for the dataset which is used for splitting.
        Specified by:
        getData in interface org.api4.java.ai.ml.core.IDataConfigurable<F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>>
        Returns:
        The original dataset that is being split.
      • getTrainFoldSize

        public double getTrainFoldSize()
        Getter for the size of the train fold.
        Returns:
        The portion of the training data.
      • getTimeoutForSolutionEvaluation

        public int getTimeoutForSolutionEvaluation()
        Getter for the timeout for evaluating a solution.
        Returns:
        The timeout for evaluating a solution.
      • withDatasetSplitter

        public F withDatasetSplitter​(org.api4.java.ai.ml.core.dataset.splitter.IDatasetSplitter<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?>> datasetSplitter)
        Configures the evaluator to use the given dataset splitter.
        Parameters:
        datasetSplitter - The dataset splitter to be used.
        Returns:
        The factory object.
      • withRandom

        public F withRandom​(java.util.Random random)
      • withNumMCIterations

        public F withNumMCIterations​(int numMCIterations)
        Configures the number of Monte Carlo cross-validation iterations.
        Parameters:
        numMCIterations - The number of iterations to run.
        Returns:
        The factory object.
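        Conceptually, an evaluator built from this factory repeats the following procedure numMCIterations times: shuffle the data with the configured Random, split it at trainFoldSize, evaluate the learner on that split, and finally average the per-split losses. The sketch below illustrates only this loop structure with a dummy stand-in for the actual training-and-scoring step; it is not the library's evaluator.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Conceptual sketch of Monte Carlo cross-validation: for each iteration,
// shuffle, split at trainFoldSize, score the split, and average the losses.
// The score(...) function is a dummy stand-in for training and measuring a learner.
public class MccvSketch {
    static double evaluate(final List<Integer> data, final int numMCIterations,
            final double trainFoldSize, final Random random) {
        double lossSum = 0;
        for (int i = 0; i < numMCIterations; i++) {
            List<Integer> shuffled = new ArrayList<>(data);
            Collections.shuffle(shuffled, random); // fresh random split each iteration
            int cut = (int) Math.round(trainFoldSize * shuffled.size());
            List<Integer> train = shuffled.subList(0, cut);
            List<Integer> test = shuffled.subList(cut, shuffled.size());
            lossSum += score(train, test); // stand-in for training + measuring the learner
        }
        return lossSum / numMCIterations; // mean loss over all Monte Carlo splits
    }

    // Dummy loss (fraction of test points), just to make the sketch runnable.
    static double score(final List<Integer> train, final List<Integer> test) {
        return (double) test.size() / (train.size() + test.size());
    }

    public static void main(final String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 100; i++) {
            data.add(i);
        }
        // 100 instances, trainFoldSize 0.7: each split uses 70 for training, 30 for testing.
        System.out.println(evaluate(data, 5, 0.7, new Random(42)));
    }
}
```

Unlike k-fold cross-validation, the splits drawn here are independent, so the same instance may land in the test fold of several iterations.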
      • withData

        public F withData​(org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<?> data)
        Configures the dataset which is split into train and test data.
        Parameters:
        data - The dataset to be split.
        Returns:
        The factory object.
      • withTrainFoldSize

        public F withTrainFoldSize​(double trainFoldSize)
        Configures the portion of the training data relative to the entire dataset size.
        Parameters:
        trainFoldSize - The size of the training fold as a fraction of the dataset size, in the open interval (0, 1).
        Returns:
        The factory object.
      • withTimeoutForSolutionEvaluation

        public F withTimeoutForSolutionEvaluation​(int timeoutForSolutionEvaluation)
        Configures a timeout for evaluating a solution.
        Parameters:
        timeoutForSolutionEvaluation - The timeout for evaluating a solution.
        Returns:
        The factory object.
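        One common way to enforce such a per-solution timeout is to run the evaluation as a separate task and abandon it once the budget elapses. The library's actual timeout mechanism may differ; the sketch below only illustrates the idea with standard java.util.concurrent primitives (the method name and NaN convention are illustrative assumptions):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Illustration of a solution-evaluation timeout: run the evaluation in its own
// task and cancel it when the configured budget elapses. This is a sketch of
// the general technique, not the library's actual implementation.
public class TimeoutSketch {
    static double evaluateWithTimeout(final Callable<Double> evaluation, final int timeoutMs)
            throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            Future<Double> result = pool.submit(evaluation);
            try {
                return result.get(timeoutMs, TimeUnit.MILLISECONDS);
            } catch (TimeoutException e) {
                result.cancel(true); // interrupt the still-running evaluation task
                return Double.NaN;   // signal that no score was obtained in time
            }
        } finally {
            pool.shutdownNow();
        }
    }

    public static void main(final String[] args) throws Exception {
        // A fast evaluation finishes normally; a slow one is cut off at the timeout.
        double fast = evaluateWithTimeout(() -> 0.25, 1000);
        double slow = evaluateWithTimeout(() -> {
            Thread.sleep(5000);
            return 0.25;
        }, 100);
        System.out.println(fast + " " + slow);
    }
}
```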
      • getSelf

        public abstract F getSelf()
      • setMeasure

        public void setMeasure​(org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,​?> measure)
        Specified by:
        setMeasure in interface org.api4.java.ai.ml.core.evaluation.IPredictionPerformanceMetricConfigurable
      • setData

        public void setData​(org.api4.java.ai.ml.core.dataset.supervised.ILabeledDataset<? extends org.api4.java.ai.ml.core.dataset.supervised.ILabeledInstance> data)
        Specified by:
        setData in interface org.api4.java.ai.ml.core.IDataConfigurable<F extends AMonteCarloCrossValidationBasedEvaluatorFactory<F>>
      • setRandom

        public void setRandom​(java.util.Random random)
        Specified by:
        setRandom in interface org.api4.java.common.control.IRandomConfigurable
      • withMeasure

        public F withMeasure​(org.api4.java.ai.ml.core.evaluation.supervised.loss.IDeterministicPredictionPerformanceMeasure<?,​?> measure)
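        A deterministic performance measure, as expected by withMeasure(...), maps a list of expected labels and a list of predicted labels to a loss value and returns the same result for the same inputs every time. The stand-in below sketches the idea with a 0/1 loss; the DeterministicMeasure interface and ZERO_ONE name are hypothetical, not the library's API:

```java
import java.util.List;

// Hypothetical stand-in for a deterministic prediction performance measure:
// same expected/predicted inputs always yield the same loss (no randomness).
// The interface and constant names are illustrative, not the library's.
interface DeterministicMeasure<Y> {
    double loss(List<Y> expected, List<Y> predicted);
}

public class MeasureSketch {
    // 0/1 loss: fraction of positions where prediction and expectation disagree.
    static final DeterministicMeasure<String> ZERO_ONE = (expected, predicted) -> {
        int mistakes = 0;
        for (int i = 0; i < expected.size(); i++) {
            if (!expected.get(i).equals(predicted.get(i))) {
                mistakes++;
            }
        }
        return (double) mistakes / expected.size();
    };

    public static void main(final String[] args) {
        List<String> expected = List.of("a", "b", "a", "b");
        List<String> predicted = List.of("a", "b", "b", "b");
        System.out.println(ZERO_ONE.loss(expected, predicted)); // one mistake out of four
    }
}
```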