

Combining one-class classifiers

Many real-world datasets have a much more complicated distribution than can be modeled by, say, a mixture of Gaussians. It can therefore be very beneficial to combine classifiers. Each classifier can focus on a specific feature or characteristic of the data. By combining the classifiers, one hopes to unite their strong points and obtain a much more flexible model.

As with normal classifiers, there is the problem that the outputs of the individual classifiers have to be rescaled such that they become comparable. For trainable combining this is not essential, but when fixed combination rules such as the mean-rule, max-rule or median-rule are used, the outputs of the classifiers must be rescaled.

In Prtools, the outputs of many classifiers are scaled by fitting a sigmoid function and then normalized such that the sum of the outputs becomes 1. In this way, the outputs of the classifier can be interpreted as (an approximation of) the class posterior probabilities. This scaling is done by the function classc (see also prex_combining.m).
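For a standard two-class classifier this looks as follows (a minimal sketch; ldc and gendatb are standard Prtools functions, and the variable names are arbitrary):

  >> a = gendatb;            % two-class banana dataset
  >> w = ldc(a)*classc;      % linear classifier, outputs normalized
  >> p = +(a*w);             % each row of p sums to 1, so the entries
  >>                         % approximate the class posteriors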

For one-class classifiers there is a small complication. There are two types of classifiers: classifiers based on density estimates, and classifiers based on distances to a model. Normalizing the first type of classifier is no problem; it directly follows the Prtools strategy. The output of the second type of classifier, on the other hand, causes problems. Imagine an object belonging to the target class: it will have a distance to the target model that is smaller than some threshold. In Prtools, however, objects are assigned to the class with the highest output. Therefore, in dd_tools the distances of the distance-based classifiers are negated, such that the output for the target class will be higher than the threshold output.
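This can be seen directly in the classifier outputs. A minimal sketch using the distance-based kmeans_dd (the variable names are arbitrary):

  >> a = target_class(gendatb);
  >> w = kmeans_dd(a,0.1,3);   % distance-based one-class classifier
  >> out = +(a*w);             % column 1: negated distance to the model,
  >>                           % column 2: negated acceptance threshold
  >> lab = labeld(a*w);        % 'target' where out(:,1) > out(:,2)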

To normalize the outputs of these distance-based classifiers, the outputs have to be negated again. This means that the standard classc cannot be applied. For this, a new function dd_normc is introduced. The standard approach is to multiply all classifiers by dd_normc by default; it does not change the classification by the classifier.

  >> a = target_class(gendatb);
  >> w1 = gauss_dd(a,0.1);    % define 4 arbitrary OC classifiers
  >> w2 = pca_dd(a,0.1,1);
  >> w3 = kmeans_dd(a,0.1,3);
  >> w4 = mog_dd(a,0.1,2);
  >>                          % combine them with the mean comb rule:
  >> W = [w1*dd_normc w2*dd_normc w3*dd_normc w4*dd_normc] * meanc;
  >> scatterd(a);
  >> plotc(W);
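
Since dd_normc only rescales the outputs, the assigned labels should not change; a quick check, reusing a and w1 from above:

  >> isequal(labeld(a*w1), labeld(a*(w1*dd_normc)))   % should return 1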

