KDD Papers

Fast Newton Hard Thresholding Pursuit for Sparsity Constrained Nonconvex Optimization

Jinghui Chen (University of Virginia); Quanquan Gu (University of Virginia)


Abstract

We propose a fast Newton hard thresholding pursuit algorithm for sparsity constrained nonconvex optimization. Our proposed algorithm reduces the per-iteration time complexity to linear in the data dimension $d$, compared with the cubic time complexity of Newton’s method, while preserving fast computational and statistical convergence rates. In particular, we prove that the proposed algorithm converges to the unknown true model parameter at a composite rate, namely quadratic at first and linear once the iterate is close to the true parameter, up to the minimax optimal statistical precision of the underlying model. Thorough experiments on both synthetic and real datasets demonstrate that our algorithm outperforms the state-of-the-art optimization algorithms for sparsity constrained optimization.
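The core primitive shared by hard thresholding pursuit methods like the one described above is the hard thresholding operator, which projects a vector onto the set of $k$-sparse vectors by keeping its $k$ largest-magnitude entries. A minimal sketch in NumPy (the function name `hard_threshold` and parameter `k` are illustrative choices, not taken from the paper):

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x; zero out the rest."""
    out = np.zeros_like(x)
    if k <= 0:
        return out
    # Indices of the k entries with the largest absolute values.
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

# Example: project a dense vector onto 2-sparse vectors.
v = np.array([3.0, -5.0, 1.0, 0.5])
print(hard_threshold(v, 2))  # keeps 3.0 and -5.0, zeros the rest
```

In a pursuit-style method, this projection is applied after each (approximate) Newton or gradient step to maintain the sparsity constraint throughout the iterations.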
