The one-class classifiers should be trained on the datasets from the previous chapter. Many one-class classifiers cannot use example outliers in their training data: they may complain, or silently ignore the outlier objects, when you supply them in your training set. For now, checking this is the responsibility of the user.
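To be safe, the outlier objects can be removed explicitly before training. A minimal sketch, using only the toolbox functions that appear elsewhere in this chapter (gendatb, oc_set, target_class, gauss_dd):

>> % Banana set with 50 target and 10 outlier objects, class '1' as target:
>> x = oc_set(gendatb([50 10]),'1');
>> % Keep only the target objects, so a classifier that cannot handle
>> % example outliers does not complain:
>> xt = target_class(x);
>> w = gauss_dd(xt,0.1);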
All one-class classifiers share the same calling convention: the first argument is the training dataset, the second is the rejection threshold on the target class, and any further arguments are method-specific parameters. An example of a one-class classifier:
>> x = target_class(gendatb([20 0]),'1');
>> w = gauss_dd(x,0.1)

This trains the classifier gauss_dd on data x (this particular classifier just estimates a Gaussian density on the target class). A threshold is set such that 10% of the training target objects will be rejected and classified as outlier. So the fraction false negative will be 0.1. (Note that this threshold is optimized on the training data. This means that the performance on an independent test set might deviate significantly!) After this rejection threshold, further parameters can be given (for instance, for the k-means clustering method, it is the number of clusters k).
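To verify this behaviour, the error of a trained one-class classifier can be inspected. The sketch below assumes the dd_tools function dd_error, which returns the false-negative and false-positive fractions of a labelled dataset mapped through the classifier:

>> x = target_class(gendatb([20 0]),'1');
>> w = gauss_dd(x,0.1);
>> % Fraction of rejected training targets, close to 0.1 by construction:
>> e = dd_error(x*w)
>> % On an independent test set the fraction may deviate:
>> z = target_class(gendatb([200 0]),'1');
>> e = dd_error(z*w)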
These one-class classifiers are normal mappings in the Prtools sense. So they can be plotted by plotc, combined with other mappings (for instance by *), etc. To check whether a mapping is a one-class classifier (i.e. it labels objects as target or outlier), use isocc.
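For instance, using the classifiers from the examples above:

>> x = target_class(gendatb([20 0]),'1');
>> w = gauss_dd(x,0.1);
>> isocc(w)     % 1: w is a one-class classifier
>> scatterd(x)
>> plotc(w)     % plotted like any other Prtools mapping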
>> x = oc_set(gendatb([50,10]),'1')
>> scatterd(x)
>> w = svdd(target_class(x),0.1,8);   % train on the target objects only
>> plotc(w)
>> w = svdd(x,0.1,8);                 % train using the example outliers as well
>> plotc(w)