Abstract
Unlike the traditional LASSO, which enforces sparsity on the variables themselves, the Generalized LASSO (GL) enforces sparsity on a linear transformation of the variables, gaining flexibility and success in many applications. However, many existing GL algorithms do not scale to high-dimensional problems, and/or only work well for a specific choice of the transformation. We propose an efficient Matching Pursuit Generalized LASSO (MPGL) method, which overcomes these issues and is guaranteed to converge to a global optimum. We formulate the GL problem as a convex quadratically constrained linear programming (QCLP) problem and tailor a cutting-plane method for it. More specifically, MPGL iteratively activates a subset of nonzero elements of the transformed variables and solves a subproblem involving only the activated elements, thus gaining a significant speed-up. Moreover, MPGL is less sensitive to the choice of the trade-off hyper-parameter between data fitting and regularization, mitigating the longstanding hyper-parameter tuning issue of many existing methods. Experiments demonstrate the superior efficiency and accuracy of the proposed method over state-of-the-art approaches on both classification and image-processing tasks.
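For reference, a minimal sketch of the Generalized LASSO objective the abstract alludes to, written in its standard form; the symbols $y$, $X$, $D$, and $\lambda$ follow common convention and are assumptions here, not taken from this page:

$$
\min_{\beta \in \mathbb{R}^{p}} \; \frac{1}{2}\,\lVert y - X\beta \rVert_2^2 \;+\; \lambda\,\lVert D\beta \rVert_1
$$

Choosing $D = I$ recovers the classical LASSO, whereas other linear operators $D$ (for example, a discrete difference operator) place the sparsity penalty on a linear transformation of the variables, which is the setting described above; $\lambda$ is the data-fitting versus regularization trade-off hyper-parameter mentioned in the abstract.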
Original language | English |
---|---|
Pages | 1934-1940 |
Number of pages | 7 |
Publication status | Published - 2017 |
Event | 31st AAAI Conference on Artificial Intelligence, AAAI 2017 - San Francisco, United States. Duration: 4 Feb 2017 → 10 Feb 2017 |
Conference
Conference | 31st AAAI Conference on Artificial Intelligence, AAAI 2017 |
---|---|
Country/Territory | United States |
City | San Francisco |
Period | 4/02/17 → 10/02/17 |