IEEE Trans Neural Netw Learn Syst. 2012 Nov;23(11):1767-78. doi: 10.1109/TNNLS.2012.2214057.

Boosted network classifiers for local feature selection.

Abstract

Like all models, network feature selection models require that assumptions be made about the size and structure of the desired features. The most common assumption is sparsity, where only a small section of the entire network is thought to produce a specific phenomenon. The sparsity assumption is enforced through regularized models, such as the lasso. However, assuming sparsity may be inappropriate for many real-world networks, which possess highly correlated modules. In this paper, we illustrate two novel optimization strategies, namely, boosted expectation propagation (BEP) and boosted message passing (BMP), which directly use the network structure to estimate the parameters of a network classifier. BEP and BMP are ensemble methods that seek to optimize classification performance by combining individual models built upon local network features. Neither BEP nor BMP assumes a sparse solution; instead, they seek a weighted average of all network features, where the weights emphasize the features that are useful for classification. In this paper, we compare BEP and BMP with network-regularized logistic regression models on simulated and real biological networks. The results show that, where highly correlated network structure exists, assuming sparsity adversely affects the accuracy and feature selection power of the network classifier.
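To make the ensemble idea concrete, the following is a minimal, hypothetical sketch of boosting over local network features. It is not the authors' BEP or BMP algorithm: it substitutes a plain AdaBoost-style sample reweighting with scikit-learn's LogisticRegression as the base learner, and the function names and the adjacency representation are assumptions made for illustration. It only demonstrates the general scheme from the abstract: weak classifiers are fit on local neighborhoods of a feature network and combined with weights, so no single sparse subset of features is ever selected.

    # Hypothetical sketch: boosting weak classifiers built on local
    # network neighborhoods, then averaging them with boosting weights.
    # This is NOT the BEP/BMP code from the paper.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def boost_local_features(X, y, adjacency, n_rounds=50):
        """X: (n_samples, n_features) array; y: labels in {-1, +1};
        adjacency: dict mapping feature index -> list of neighbor indices
        (the feature network)."""
        n, p = X.shape
        sample_w = np.full(n, 1.0 / n)   # boosting weights over samples
        ensemble = []                    # (alpha, feature subset, model)
        for _ in range(n_rounds):
            best = None
            for j in range(p):
                # Local network feature: node j plus its neighbors.
                feats = [j] + list(adjacency.get(j, []))
                clf = LogisticRegression().fit(
                    X[:, feats], y, sample_weight=sample_w)
                pred = clf.predict(X[:, feats])
                err = np.sum(sample_w * (pred != y))
                if best is None or err < best[0]:
                    best = (err, feats, clf, pred)
            err, feats, clf, pred = best
            err = np.clip(err, 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)   # AdaBoost-style weight
            # Upweight samples the chosen local model misclassified.
            sample_w *= np.exp(-alpha * y * pred)
            sample_w /= sample_w.sum()
            ensemble.append((alpha, feats, clf))
        return ensemble

    def predict(ensemble, X):
        # Weighted vote over all local models; every network feature
        # can contribute, with its weight reflecting its usefulness.
        score = sum(a * m.predict(X[:, f]) for a, f, m in ensemble)
        return np.sign(score)

Under this sketch, a feature's importance is read off from the accumulated alpha weights of the neighborhoods containing it, rather than from a sparse nonzero pattern as in lasso-style regularization.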

PMID: 24808071
DOI: 10.1109/TNNLS.2012.2214057
[Indexed for MEDLINE]
