IEEE Trans Neural Netw. 2008 Nov;19(11):1854-72. doi: 10.1109/TNN.2008.2003299.

Combining DC algorithms (DCAs) and decomposition techniques for the training of nonpositive-semidefinite kernels.

Author information

1. Planning Department, AES Sonel, Douala, Littoral 4077, Cameroon. francois.akoa@aes.com

Abstract

Today, decomposition methods are among the most popular methods for training support vector machines (SVMs). With the use of kernels that do not satisfy Mercer's condition, new techniques must be designed to handle the nonpositive-semidefinite kernels resulting from this choice. In this work, we incorporate difference-of-convex-functions (DC) optimization techniques into decomposition methods to tackle this difficulty. The new approach needs no problem modification, and we show that the use of a truncated DC algorithm (DCA) alone in the decomposition scheme produces a sufficient decrease of the objective function at each iteration. Thanks to this property, an asymptotic convergence proof of the new algorithm is obtained without any blockwise convexity assumption on the objective function. We also investigate a working set selection rule using second-order information for sequential minimal optimization (SMO)-type decomposition in the spirit of DC optimization. Numerical results show the robustness and efficiency of the new methods compared with state-of-the-art software.
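To make the abstract's idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of a DCA outer loop for an SVM-like dual with an indefinite kernel matrix. The indefinite matrix Q is split via its eigendecomposition into a difference of two positive-semidefinite parts, Q = K1 - K2; at each outer iteration the concave part is linearized at the current iterate and the resulting convex box-constrained QP is solved only approximately (a "truncated" inner solve, here a few projected-gradient steps). All function names and the choice of projected gradient as the inner solver are illustrative assumptions.

```python
import numpy as np

def dc_split(Q):
    """One common DC decomposition of an indefinite symmetric matrix:
    Q = K1 - K2 with K1, K2 both PSD, built from the eigendecomposition."""
    w, V = np.linalg.eigh(Q)
    K1 = (V * np.clip(w, 0.0, None)) @ V.T   # positive eigenvalue part
    K2 = (V * np.clip(-w, 0.0, None)) @ V.T  # negated negative part
    return K1, K2

def truncated_inner_solve(K1, c, C, a0, steps=300):
    """Approximately minimize the convex subproblem
        0.5 a^T K1 a - c^T a   s.t.  0 <= a <= C
    by projected gradient descent started at the current DCA iterate."""
    L = max(np.linalg.eigvalsh(K1).max(), 1e-8)  # Lipschitz constant of grad
    a = a0.copy()
    for _ in range(steps):
        a = np.clip(a - (K1 @ a - c) / L, 0.0, C)
    return a

def dca_train(K, y, C=1.0, iters=15):
    """DCA outer loop for min 0.5 a^T Q a - 1^T a, 0 <= a <= C,
    where Q = diag(y) K diag(y) may be indefinite."""
    Q = np.outer(y, y) * K
    K1, K2 = dc_split(Q)
    a = np.zeros(len(y))
    history = []
    for _ in range(iters):
        # Linearize the concave part -0.5 a^T K2 a at the current iterate:
        # the subproblem's linear term becomes 1 + K2 a_k.
        c = np.ones_like(a) + K2 @ a
        a = truncated_inner_solve(K1, c, C, a)
        history.append(0.5 * a @ Q @ a - a.sum())
    return a, history
```

Because the inner solver starts from the current iterate and can only decrease the convex surrogate, each outer step decreases the true (nonconvex) objective as well, which mirrors the sufficient-decrease property the abstract attributes to the truncated DCA.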

PMID:
18990641
DOI:
10.1109/TNN.2008.2003299
[Indexed for MEDLINE]
