In research, one often wants to see what the values of the optimized parameters are. For each type of classifier a data structure is saved. This can be retrieved by:
>> W = +w;       % possibility 1
>> W = w.data;   % possibility 2

This W now contains several sub-fields with the optimized parameters. Which parameters are stored depends on the classifier. The format is free, except that one parameter, threshold, should always be present.
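As a sketch of this general pattern, one can list all stored fields with fieldnames and check the mandatory threshold. (This assumes a trained data description is available; gauss_dd is used here only as an example, and any trained classifier from the toolbox should behave the same way.)

```matlab
% Sketch: inspect the stored parameters of a trained data description.
x = target_class(gendatb([50,0]),'1');  % some example target data
w = gauss_dd(x,0.1);                    % a simple Gaussian data description
W = +w;                                 % same as W = w.data
fieldnames(W)                           % which parameter fields are stored
W.threshold                             % the one field every classifier has
```

The exact field names other than threshold differ per classifier, so fieldnames is a convenient first step when exploring a new method.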
An example is the support vector data description. In a quadratic optimization procedure, the weights are optimized. Assume I want to have 10% of the data on the boundary, using a Gaussian kernel with a kernel parameter of 5 (forget the details, they are not important). Now I am interested in what the optimal α's will be:
>> x = target_class(gendatb([50,0]),'1');
>> w = svdd(x,0.1,5);
>> W = +w;
>> W.a

Note: The SVDD has been changed in this new version of the toolbox. Have a look at the remarks at the end of the file (chapter 6).
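Continuing this example, the α's indicate which training objects carry weight in the solution. A small sketch (assuming, as above, that the optimized coefficients are stored in W.a; the exact layout of this field may differ between toolbox versions):

```matlab
% Sketch: count the effectively nonzero weights of a trained SVDD.
x = target_class(gendatb([50,0]),'1');
w = svdd(x,0.1,5);
W = +w;
W.a                        % the optimized weights
sum(abs(W.a) > 1e-8)       % number of essentially nonzero weights
```

Objects with a nonzero weight are the support vectors; with 10% of the data on the boundary, this should be a small subset of the training set.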
Another example is the Mixture of Gaussians. Let us see if we can plot the boundary and the centers of the clusters. First, we create some data and train the classifier (using 5 clusters):
>> x = target_class(gendatb([100,0]),'1');
>> w = mog_dd(x,0.1,5);

Now we inspect the trained classifier and give some visual feedback:
>> W = +w
>> scatterd(x);
>> plotc(w); hold on;
>> scatterd(W.m,'r*')

Apparently, the means of the clusters are stored in the m field of the structure in the classifier.
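As a quick check (assuming W.m indeed stores one cluster mean per row), the number of clusters requested above should be reflected in the size of this field:

```matlab
% Sketch: verify the shape of the stored cluster means.
x = target_class(gendatb([100,0]),'1');  % gendatb generates 2D banana data
w = mog_dd(x,0.1,5);                     % 5 clusters requested
W = +w;
size(W.m)                                % one row per cluster, one column per feature
```
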