ISSN : 1796-203X
Volume : 1    Issue : 7    Date : October/November 2006

Parameter Optimization of Kernel-based One-class Classifier on Imbalance Learning
Ling Zhuang and Honghua Dai
Page(s): 32-40
Full Text: PDF (342 KB)

Compared with conventional two-class learning schemes, one-class classification uses only a
single class in the classifier training phase. Applying one-class classification to learn from
imbalanced data sets is regarded as recognition-based learning and has been shown to have the
potential to achieve better performance. As in two-class learning, parameter selection is a
significant issue, especially when the classifier is sensitive to its parameters. For kernel-based
one-class learning schemes, such as the one-class Support Vector Machine and Support
Vector Data Description, besides the parameters involved in the kernel, there is another
one-class-specific parameter: the rejection rate ν. In this paper, we propose a general framework
that involves the majority class in solving the parameter selection problem. In this framework, we
first use the minority (target) class for training in the one-class classification stage; we then use
both the minority and the majority class to estimate the generalization performance of the
constructed classifier. This generalization performance is set as the optimization criterion. We
employ Grid search and Experiment Design search to attain various parameter settings.
Experiments on UCI and Reuters text data show that the parameter-optimized one-class
classifiers outperform all the standard one-class learning schemes we examined.
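The two-stage framework described above can be sketched in a few lines: fit a kernel one-class classifier on the minority class only, then score each candidate (ν, kernel-parameter) pair on held-out data from both classes. The sketch below is illustrative, not the authors' exact procedure; the library choice (scikit-learn's OneClassSVM), the synthetic data, and the geometric-mean criterion are all assumptions standing in for the paper's setup.

```python
# Hedged sketch of the framework: train on the minority (target) class only,
# then use BOTH classes to estimate generalization for each parameter setting.
# scikit-learn, the toy data, and the g-mean criterion are illustrative
# assumptions, not the authors' exact configuration.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Imbalanced toy data: small minority "target" class, large majority class.
X_min = rng.normal(loc=0.0, scale=1.0, size=(40, 2))   # minority class
X_maj = rng.normal(loc=4.0, scale=1.0, size=(400, 2))  # majority class

def g_mean(clf, X_min, X_maj):
    # Generalization criterion using both classes: geometric mean of the
    # minority acceptance rate and the majority rejection rate.
    acc_min = np.mean(clf.predict(X_min) == 1)    # targets accepted
    acc_maj = np.mean(clf.predict(X_maj) == -1)   # non-targets rejected
    return float(np.sqrt(acc_min * acc_maj))

best = None
for nu in [0.01, 0.05, 0.1, 0.2]:           # grid over rejection rate nu
    for gamma in [0.01, 0.1, 1.0]:          # grid over RBF kernel width
        clf = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma)
        clf.fit(X_min)                      # one-class stage: minority only
        score = g_mean(clf, X_min, X_maj)   # estimate with both classes
        if best is None or score > best[0]:
            best = (score, nu, gamma)

print("best g-mean %.3f at nu=%.2f gamma=%.2f" % best)
```

The Experiment Design search mentioned in the abstract would replace the nested loops with a more economical sampling of the (ν, γ) space; the evaluation criterion stays the same.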

Index Terms
one-class classification framework, imbalance learning, one-class Support Vector Machine,
Support Vector Data Description (SVDD)