public class KSTest
extends java.lang.Object
The two-sample KS test is one of the most useful and general non-parametric methods for comparing two samples, as it is sensitive to differences in both location and shape of the empirical cumulative distribution functions of the two samples.
The Kolmogorov-Smirnov test can be modified to serve as a goodness-of-fit test. In the special case of testing for normality, the sample is standardized and compared with a standard normal distribution. This is equivalent to setting the mean and variance of the reference distribution equal to the sample estimates, and using the sample to modify the null hypothesis in this way reduces the power of the test. Correcting for this bias leads to the Lilliefors test. However, even Lilliefors' modification is less powerful than the Shapiro-Wilk test or the Anderson-Darling test for testing normality.
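To make the one-sample statistic concrete, here is a minimal, self-contained sketch (not Smile's implementation) of D = sup |F_n(x) - F(x)| for a standardized sample against the standard normal CDF. The CDF uses a polynomial approximation (Abramowitz and Stegun 26.2.17), which is an assumption made here purely for self-containment; the class and method names are illustrative.

```java
import java.util.Arrays;

// Illustrative sketch of the one-sample KS statistic against N(0, 1).
// Not Smile's code; names and the CDF approximation are assumptions.
public class KSSketch {
    // Standard normal CDF via a polynomial approximation
    // (Abramowitz & Stegun 26.2.17, absolute error < 7.5e-8).
    static double phi(double z) {
        double t = 1.0 / (1.0 + 0.2316419 * Math.abs(z));
        double d = 0.3989423 * Math.exp(-z * z / 2);
        double p = d * t * (0.3193815 + t * (-0.3565638
                + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
        return z >= 0 ? 1 - p : p;
    }

    // D = sup |F_n(x) - F(x)|. Since the empirical CDF F_n is a step
    // function, the supremum is attained just before or just after a
    // sample point, so it suffices to check i/n and (i+1)/n at each
    // sorted observation.
    static double ksStatistic(double[] x) {
        double[] s = x.clone();
        Arrays.sort(s);
        int n = s.length;
        double d = 0;
        for (int i = 0; i < n; i++) {
            double f = phi(s[i]);
            d = Math.max(d, Math.max(f - (double) i / n,
                                     (double) (i + 1) / n - f));
        }
        return d;
    }
}
```

For a sample that closely tracks standard normal quantiles, D is small; large D values are evidence against the hypothesized distribution.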
| Modifier and Type | Field and Description |
|---|---|
| `double` | `d` The Kolmogorov-Smirnov statistic. |
| `double` | `pvalue` The p-value. |
| Modifier and Type | Method and Description |
|---|---|
| `static KSTest` | `test(double[] x, Distribution dist)` The one-sample KS test for the null hypothesis that the data set `x` is drawn from the given distribution. |
| `static KSTest` | `test(double[] x, double[] y)` The two-sample KS test for the null hypothesis that the data sets are drawn from the same distribution. |
public static KSTest test(double[] x, Distribution dist)
public static KSTest test(double[] x, double[] y)
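The two-sample test is based on the statistic D = sup |F_n(x) - G_m(x)|, the largest gap between the two empirical CDFs. A minimal, self-contained sketch (again not Smile's code; the class and method names are illustrative) computes it by merging the two sorted samples and tracking that gap:

```java
import java.util.Arrays;

// Illustrative sketch of the two-sample KS statistic
// D = sup |F_n(x) - G_m(x)|. Not Smile's implementation.
public class TwoSampleKS {
    static double ksStatistic(double[] x, double[] y) {
        double[] a = x.clone(), b = y.clone();
        Arrays.sort(a);
        Arrays.sort(b);
        int n = a.length, m = b.length;
        int i = 0, j = 0;
        double d = 0;
        // Walk both sorted samples; at each distinct value, advance
        // past all ties in both samples, then compare the two
        // empirical CDFs i/n and j/m.
        while (i < n && j < m) {
            double v = Math.min(a[i], b[j]);
            while (i < n && a[i] == v) i++;
            while (j < m && b[j] == v) j++;
            d = Math.max(d, Math.abs((double) i / n - (double) j / m));
        }
        return d;
    }
}
```

Identical samples give D = 0, while two samples with disjoint supports give D = 1, the statistic's extremes; the p-value reported by `test(double[] x, double[] y)` is then derived from the distribution of D under the null hypothesis.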