Dr. Shi Yueyong of our university has published a paper titled "Newton-Raphson Meets Sparsity: Sparse Learning Via a Novel Penalty and a Fast Solver" in the T1-tier journal IEEE Transactions on Neural Networks and Learning Systems. Shi Yueyong is an associate professor at our university.
Abstract:
In machine learning and statistics, penalized regression methods are the main tools for variable selection (or feature selection) in high-dimensional sparse data analysis. Due to the nonsmoothness of the associated thresholding operators of commonly used penalties such as the least absolute shrinkage and selection operator (LASSO), the smoothly clipped absolute deviation (SCAD), and the minimax concave penalty (MCP), the classical Newton-Raphson algorithm cannot be used. In this article, we propose a cubic Hermite interpolation penalty (CHIP) with a smoothing thresholding operator. Theoretically, we establish the nonasymptotic estimation error bounds for the global minimizer of the CHIP penalized high-dimensional linear regression. Moreover, we show that the estimated support coincides with the target support with a high probability. We derive the Karush-Kuhn-Tucker (KKT) condition for the CHIP penalized estimator and then develop a support detection-based Newton-Raphson (SDNR) algorithm to solve it. Simulation studies demonstrate that the proposed method performs well in a wide range of finite sample situations. We also illustrate the application of our method with a real data example.
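The abstract notes that the nonsmooth thresholding operators of penalties such as the LASSO prevent direct use of classical Newton-Raphson. As a minimal illustration of that obstacle (not the paper's CHIP penalty or SDNR algorithm, whose details are not given in this announcement), the sketch below shows the LASSO soft-thresholding operator, which has a kink at |z| = λ, together with a standard first-order proximal-gradient (ISTA) solver that sidesteps the missing second derivative:

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator associated with the LASSO penalty.
    It is continuous but not differentiable at |z| = lam, which is the
    nonsmoothness that blocks classical Newton-Raphson iterations."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def ista(X, y, lam, step=None, n_iter=500):
    """Iterative soft-thresholding (proximal gradient) for
        min_b  0.5 * ||y - X b||^2 + lam * ||b||_1,
    a standard first-order alternative when second-order (Newton)
    steps are unavailable due to the nonsmooth penalty."""
    n, p = X.shape
    if step is None:
        # Safe step size: inverse of the Lipschitz constant of the
        # smooth part, i.e. 1 / ||X||_2^2 (squared spectral norm).
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)          # gradient of the smooth loss
        b = soft_threshold(b - step * grad, step * lam)  # prox step
    return b
```

A smoothed penalty such as CHIP replaces the kinked operator with a differentiable surrogate, which is what makes Newton-type steps (as in the paper's SDNR solver) applicable.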
Paper information:
Title:
Newton-Raphson Meets Sparsity: Sparse Learning Via a Novel Penalty and a Fast Solver
Authors:
Cao Yongxiu; Kang Lican; Li Xuerui; Liu Yanyan; Luo Yuan; Shi Yueyong
Keywords:
Computing and Processing; Communication, Networking and Broadcast Technologies; Components, Circuits, Devices and Systems; General Topics for Engineers; Interpolation; Smoothing methods; Linear regression; Computational modeling
Indexed by:
EI; INSPEC; MEDLINE; SCI; Scopus; WAJCI
DOI: 10.1109/TNNLS.2023.3251748
Full text:
//libproxy.tqzp.org/https/443/org/ieee/ieeexplore/yitlink/document/10064704