object UncertaintyCorrelation extends Merit[Double] with Product with Serializable
Measure of the correlation between the predicted uncertainty and the error magnitude.
This is expressed as a ratio of correlation coefficients. The numerator is the correlation coefficient between the predicted uncertainty and the actual error magnitude. The denominator is the correlation coefficient between the predicted uncertainty and the ideal error distribution. That is: let X be the predicted uncertainty and Y := N(0, x) be the ideal error distribution about each predicted uncertainty x; the denominator is the correlation coefficient between X and Y. In the absence of a closed form for that coefficient, it is modeled empirically by drawing from N(0, x) to produce an "ideal" error series from which the correlation coefficient can be estimated.
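The ratio described above can be sketched as a standalone computation. This is a hypothetical illustration, not the library's implementation: `pearson` and `correlationRatio` are names introduced here, and the "ideal" error series is produced by the empirical draw from N(0, x) that the description mentions.

```scala
import scala.util.Random

// Hypothetical sketch of the correlation-ratio idea; names are illustrative.
object UncertaintyCorrelationSketch {
  // Plain Pearson correlation coefficient between two equal-length series.
  def pearson(x: Seq[Double], y: Seq[Double]): Double = {
    val mx = x.sum / x.size
    val my = y.sum / y.size
    val cov = x.zip(y).map { case (a, b) => (a - mx) * (b - my) }.sum / x.size
    val vx = x.map(a => (a - mx) * (a - mx)).sum / x.size
    val vy = y.map(b => (b - my) * (b - my)).sum / y.size
    cov / math.sqrt(vx * vy)
  }

  /** Ratio of the observed correlation (uncertainty vs. actual error magnitude)
    * to the "ideal" correlation, estimated empirically by drawing an error
    * from N(0, sigma) for each predicted uncertainty sigma. */
  def correlationRatio(
      predicted: Seq[Double],
      uncertainty: Seq[Double],
      actual: Seq[Double],
      rng: Random = new Random(0L)
  ): Double = {
    val observedError = predicted.zip(actual).map { case (p, a) => math.abs(p - a) }
    val idealError = uncertainty.map(sigma => math.abs(rng.nextGaussian() * sigma))
    pearson(uncertainty, observedError) / pearson(uncertainty, idealError)
  }
}
```

For well-calibrated predictions (errors actually drawn from N(0, sigma)), both the numerator and denominator estimate the same quantity, so the ratio tends toward 1.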
- By Inheritance
- UncertaintyCorrelation
- Serializable
- Serializable
- Product
- Equals
- Merit
- AnyRef
- Any
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[java.lang]
  - Definition Classes: AnyRef
  - Annotations: @native() @throws( ... )
- def computeFromPredictedUncertaintyActual(pua: Seq[(Double, Double, Double)]): Double
  Covariance(X, Y) / Sqrt(Var(X) * Var(Y)), where X is the predicted uncertainty and Y is the magnitude of the error.
  - pua: tuples of (predicted, uncertainty, actual)
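The documented formula, Covariance(X, Y) / Sqrt(Var(X) * Var(Y)), can be written out directly over the `pua` triples. This is a hedged sketch under the stated formula, not the library's source; the function name is introduced here for illustration.

```scala
// Hypothetical sketch of Covariance(X, Y) / Sqrt(Var(X) * Var(Y)),
// where X is the predicted uncertainty and Y = |predicted - actual|.
// The triples mirror the documented `pua: Seq[(Double, Double, Double)]`.
def uncertaintyErrorCorrelation(pua: Seq[(Double, Double, Double)]): Double = {
  val x = pua.map(_._2)                                 // predicted uncertainty
  val y = pua.map { case (p, _, a) => math.abs(p - a) } // error magnitude
  val n = pua.size.toDouble
  val mx = x.sum / n
  val my = y.sum / n
  val cov = x.zip(y).map { case (a, b) => (a - mx) * (b - my) }.sum / n
  val varX = x.map(a => (a - mx) * (a - mx)).sum / n
  val varY = y.map(b => (b - my) * (b - my)).sum / n
  cov / math.sqrt(varX * varY)
}
```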
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def estimate(pva: Iterable[(PredictionResult[Double], Seq[Double])]): (Double, Double)
  Estimate the merit and the uncertainty in the merit over batches of predicted and ground-truth values.
  - pva: predicted-vs-actual data as an iterable over PredictionResult and ground-truth tuples
  - returns: the estimate of the merit value and the uncertainty in that estimate
  - Definition Classes: Merit
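One common way to turn per-batch merit values into the `(Double, Double)` shape that `estimate` returns is the mean and the standard error of the mean. The helper below is a sketch of that aggregation only; the trait's actual implementation over `PredictionResult` batches may differ.

```scala
// Hypothetical aggregation sketch: given the merit computed on each batch,
// report (mean, standard error of the mean) as (estimate, uncertainty).
def meanAndStandardError(batchMerits: Seq[Double]): (Double, Double) = {
  val n = batchMerits.size.toDouble
  val mean = batchMerits.sum / n
  // Sample variance (Bessel's correction), then the standard error of the mean.
  val variance = batchMerits.map(m => (m - mean) * (m - mean)).sum / (n - 1)
  (mean, math.sqrt(variance / n))
}
```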
- def evaluate(predictionResult: PredictionResult[Double], actual: Seq[Double]): Double
  Apply the figure of merit to a prediction result and a set of ground-truth values.
  - returns: the value of the figure of merit
  - Definition Classes: UncertaintyCorrelation → Merit
- def finalize(): Unit
  - Attributes: protected[java.lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @throws( ... )