To find a good one-class classifier, two types of errors have to be minimized: the fraction of false positives and the fraction of false negatives. Table 2.1 shows all possible classification situations for one-class classification.
The fraction of false negatives can be estimated using (for instance) cross-validation on the target training set. Unfortunately, the fraction of false positives is much harder to estimate. When no example outlier objects are available, this fraction cannot be estimated at all. Minimizing just the fraction of false negatives will result in a degenerate classifier which labels all objects as target objects. To avoid this solution, outlier examples have to be available, or artificial outliers have to be generated (see also section 5.5).
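The two estimation procedures above can be sketched in code. The following is a minimal illustration, not the method from the text: it assumes scikit-learn's `OneClassSVM` as the one-class classifier, estimates the false-negative fraction by cross-validation on target-only data, and probes the false-positive fraction with artificial outliers drawn uniformly around the target data.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Hypothetical target class: only target examples are assumed available.
targets = rng.normal(loc=0.0, scale=1.0, size=(200, 2))

# False-negative fraction: cross-validate on the target set and count
# how many held-out target objects are rejected as outliers.
fn_fractions = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True,
                                 random_state=0).split(targets):
    clf = OneClassSVM(nu=0.1, gamma="scale").fit(targets[train_idx])
    preds = clf.predict(targets[test_idx])      # +1 = target, -1 = outlier
    fn_fractions.append(np.mean(preds == -1))   # rejected targets
fn_estimate = float(np.mean(fn_fractions))

# False-positive fraction: without real outlier examples, generate
# artificial outliers (here: uniform over a box enclosing the targets)
# and count how many the classifier accepts as targets.
outliers = rng.uniform(low=-5.0, high=5.0, size=(200, 2))
clf = OneClassSVM(nu=0.1, gamma="scale").fit(targets)
fp_estimate = float(np.mean(clf.predict(outliers) == 1))
```

Note that the false-positive estimate depends heavily on how the artificial outliers are distributed; the uniform box used here is only one arbitrary choice.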