IEEE Trans Pattern Anal Mach Intell. 2013 May;35(5):1025-38. doi: 10.1109/TPAMI.2012.189.

A convex formulation for learning a shared predictive structure from multiple tasks.

Author information

  • 1. GE Global Research, 2623 Camino Ramon, San Ramon, CA 94583, USA. jchen@ge.com

Abstract

In this paper, we consider the problem of learning from multiple related tasks for improved generalization performance by extracting their shared structures. The alternating structure optimization (ASO) algorithm, which couples all tasks using a shared feature representation, has been successfully applied in various multitask learning problems. However, ASO is nonconvex and the alternating algorithm only finds a local solution. We first present an improved ASO formulation (iASO) for multitask learning based on a new regularizer. We then convert iASO, a nonconvex formulation, into a relaxed convex one (rASO). Interestingly, our theoretical analysis reveals that rASO finds a globally optimal solution to its nonconvex counterpart iASO under certain conditions. rASO can be equivalently reformulated as a semidefinite program (SDP), which is, however, not scalable to large datasets. We propose to employ the block coordinate descent (BCD) method and the accelerated projected gradient (APG) algorithm separately to find the globally optimal solution to rASO; we also develop efficient algorithms for solving the key subproblems involved in BCD and APG. The experiments on the Yahoo web page datasets and the Drosophila gene expression pattern image datasets demonstrate the effectiveness and efficiency of the proposed algorithms and confirm our theoretical analysis.
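For readers who want the optimization picture behind the abstract, the following is a minimal LaTeX sketch of an ASO-style formulation and a convex relaxation in the spirit of rASO, based on the standard setup of Ando and Zhang; the notation, the loss L, the regularization constants, and the constraint set below are illustrative assumptions, not quoted from this paper.

% Sketch only: notation and constants are assumptions, not the paper's own.
% Task t in {1, ..., m} has n_t examples (x_i^t, y_i^t) and a linear
% predictor u_t; Theta (h x d, orthonormal rows) is the shared structure.
\begin{align*}
  \min_{\{u_t, v_t\},\, \Theta}\quad
    & \sum_{t=1}^{m} \Bigl( \frac{1}{n_t} \sum_{i=1}^{n_t}
        L\bigl(u_t^{\top} x_i^{t},\, y_i^{t}\bigr)
        + \alpha \lVert u_t - \Theta^{\top} v_t \rVert_2^2
        + \beta \lVert u_t \rVert_2^2 \Bigr) \\
  \text{s.t.}\quad & \Theta \Theta^{\top} = I_h .
\end{align*}
% The orthonormality constraint on Theta makes this nonconvex. A relaxation
% in the spirit of rASO replaces M = Theta^T Theta by its convex hull
\[
  \mathcal{M} = \bigl\{ M \in \mathbb{S}^{d} \;:\;
    \operatorname{tr}(M) = h,\; 0 \preceq M \preceq I_d \bigr\},
\]
% over which the objective (after minimizing out v_t) becomes jointly convex;
% a convex problem of this form is what methods such as BCD or APG can then
% solve to global optimality.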

PMID: 23520249
PMCID: PMC3784327
DOI: 10.1109/TPAMI.2012.189