MST-GEN: An Efficient Parameter Selection Method for One-Class Extreme Learning Machine

IEEE Trans Cybern. 2017 Oct;47(10):3266-3279. doi: 10.1109/TCYB.2017.2707463. Epub 2017 Jun 5.

Abstract

One-class classification (OCC) models a set of target data from one class to detect outliers. OCC approaches such as the one-class support vector machine (OCSVM) and support vector data description (SVDD) have wide practical applications. Recently, the one-class extreme learning machine (OCELM), which inherits the fast learning speed of the original ELM and achieves equivalent or higher data description performance than OCSVM and SVDD, has been proposed as a promising alternative. However, OCELM faces the same thorny parameter selection problem as OCSVM and SVDD, which significantly affects its performance and remains under-explored. This paper proposes minimal spanning tree (MST)-GEN, an automatic way to select proper parameters for OCELM. Specifically, we first build an n-round MST to model the structure and distribution of the given target set. With information from the n-round MST, a controllable number of pseudo outliers are generated by edge pattern detection and a novel "repelling" process, which readily overcomes two fundamental problems in previous outlier generation methods: where and how many pseudo outliers should be generated. Unlike previous methods that only generate pseudo outliers, we further exploit the n-round MST to generate pseudo target data, so as to avoid the time-consuming cross-validation process and accelerate parameter selection. Extensive experiments on various datasets suggest that the proposed method selects parameters for OCELM more efficiently and accurately than existing methods, which enables OCELM to achieve better performance in OCC applications. Furthermore, our experiments show that MST-GEN can also be favorably applied to other prevalent OCC methods such as OCSVM and SVDD.
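
To make the general idea of MST-based pseudo-outlier generation concrete, the following is a minimal Python sketch. It builds a single MST over the target set, treats leaf vertices of the tree as boundary ("edge pattern") points, and pushes each of them away from the data centroid by roughly one average edge length. The single-pass MST, the leaf-vertex heuristic, and the fixed push distance are simplifying assumptions for illustration only; they do not reproduce the paper's n-round MST, edge pattern detection, or repelling process.

import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def generate_pseudo_outliers(X, push_scale=1.0):
    """Generate one pseudo outlier per leaf (boundary) vertex of the MST."""
    D = distance_matrix(X, X)                  # pairwise Euclidean distances
    mst = minimum_spanning_tree(D).toarray()   # upper-triangular MST edge weights
    adj = mst + mst.T                          # symmetric weighted adjacency
    degrees = (adj > 0).sum(axis=1)            # vertex degrees in the MST
    mean_edge = adj[adj > 0].mean()            # average MST edge length
    center = X.mean(axis=0)

    pseudo = []
    for i in np.where(degrees == 1)[0]:        # leaf vertices ~ boundary points
        direction = X[i] - center
        norm = np.linalg.norm(direction)
        if norm == 0:
            continue
        # "Repel" the boundary point away from the data centroid by about
        # one average edge length (assumed heuristic, not the paper's rule).
        pseudo.append(X[i] + push_scale * mean_edge * direction / norm)
    return np.array(pseudo)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))              # toy 2-D target set
    outliers = generate_pseudo_outliers(X)
    print(outliers.shape)

In a parameter selection loop, such pseudo outliers (together with pseudo target data) could stand in for a labeled validation set, so candidate OCELM parameters can be scored without cross-validation on the target class alone.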