Sangin LEE - Academia.edu

Sangin LEE

Papers by Sangin LEE

The Mnet method for variable selection

Statistica Sinica, 2016

We propose a penalized approach for variable selection using a combination of minimax concave and ridge penalties. The method is designed to deal with p ≥ n problems with highly correlated predictors. We call this the Mnet method. Similar to the elastic net of Zou and Hastie (2005), the Mnet tends to select or drop highly correlated predictors together. However, unlike the elastic net, the Mnet is selection consistent and equal to the oracle ridge estimator with high probability under reasonable conditions. We develop an efficient coordinate descent algorithm to compute the Mnet estimates. Simulation studies show that the Mnet has better performance in the presence of highly correlated predictors than either the elastic net or MCP. We illustrate the application of the Mnet to data from a gene expression study in ophthalmology.
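The coordinate descent algorithm mentioned in the abstract admits a closed-form update for the MCP-plus-ridge objective. The sketch below is an illustrative reimplementation, not the authors' code; it assumes standardized predictors (x_j'x_j / n = 1) and γ(1 + λ₂) > 1 so that each univariate subproblem is convex, and the function names are my own.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def mnet_coordinate_descent(X, y, lam1, lam2, gamma=3.0, n_iter=500, tol=1e-8):
    """Cyclic coordinate descent for the Mnet-type objective
        (1/2n)||y - Xb||^2 + MCP(b; lam1, gamma) + (lam2/2)||b||^2.
    Assumes columns of X are standardized (mean 0, x_j'x_j/n = 1)
    and gamma * (1 + lam2) > 1 (convex univariate subproblems)."""
    n, p = X.shape
    b = np.zeros(p)
    r = y - X @ b  # current residual
    for _ in range(n_iter):
        b_old = b.copy()
        for j in range(p):
            z = X[:, j] @ r / n + b[j]  # univariate least-squares solution
            if np.abs(z) <= gamma * lam1 * (1 + lam2):
                # MCP region: firm thresholding adjusted for the ridge term
                b_new = soft_threshold(z, lam1) / (1 + lam2 - 1 / gamma)
            else:
                # beyond gamma*lam1 the MCP derivative is zero: ridge-only update
                b_new = z / (1 + lam2)
            r += X[:, j] * (b[j] - b_new)  # update residual in O(n)
            b[j] = b_new
        if np.max(np.abs(b - b_old)) < tol:
            break
    return b
```

Setting lam1 = lam2 = 0 reduces the update to plain Gauss–Seidel on the least-squares problem, which is a convenient sanity check.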

Balancing stability and bias reduction in variable selection with the Mnet estimator

We propose a new penalized approach for variable selection using a combination of minimax concave and ridge penalties. The proposed method is designed to deal with p ≥ n problems with highly correlated predictors. We call the proposed approach the Mnet method. Similar to the elastic net of Zou and Hastie (2005), the Mnet also tends to select or drop highly correlated predictors together. However, unlike the elastic net, the Mnet is selection consistent and equal to the oracle ridge estimator with high probability under reasonable conditions. We develop an efficient coordinate descent algorithm to compute the Mnet estimates. Simulation studies show that the Mnet has better performance in the presence of highly correlated predictors than either the elastic net or MCP. Finally, we illustrate the application of the Mnet to real data from a gene expression study in ophthalmology.

Strong Rules for Nonconvex Penalties and Their Implications for Efficient Algorithms in High-Dimensional Regression

Journal of Computational and Graphical Statistics, 2015

We consider approaches for improving the efficiency of algorithms for fitting nonconvex penalized regression models such as SCAD and MCP in high dimensions. In particular, we develop rules for discarding variables during cyclic coordinate descent. This dimension reduction leads to a substantial improvement in the speed of these algorithms for high-dimensional problems. The rules we propose here eliminate a substantial fraction of the variables from the coordinate descent algorithm. Violations are quite rare, especially in the locally convex region of the solution path, and furthermore, may be easily detected and corrected by checking the Karush-Kuhn-Tucker conditions. We extend these rules to generalized linear models, as well as to other nonconvex penalties such as the ℓ2-stabilized Mnet penalty, group MCP, and group SCAD. We explore three variants of the coordinate descent algorithm that incorporate these rules and study the efficiency of these algorithms in fitting models to both simulated data and real data from a genome-wide association study.
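The screening idea, discarding predictors whose correlation with the current residual falls below a sequential threshold and then verifying the Karush-Kuhn-Tucker conditions on the discarded set, can be illustrated with the lasso version of the strong rule (Tibshirani et al., 2012); the nonconvex extension in the paper follows the same fit/screen/check loop. This is a minimal sketch with hypothetical function names, not the authors' implementation, and it assumes standardized predictors and a decreasing λ path.

```python
import numpy as np

def lasso_cd(X, y, lam, b, active, n_iter=200, tol=1e-8):
    """Coordinate descent for the lasso restricted to `active` columns,
    warm-started from b; assumes standardized X (x_j'x_j/n = 1)."""
    n = X.shape[0]
    r = y - X @ b
    for _ in range(n_iter):
        delta = 0.0
        for j in active:
            z = X[:, j] @ r / n + b[j]
            b_new = np.sign(z) * max(abs(z) - lam, 0.0)  # soft threshold
            r += X[:, j] * (b[j] - b_new)
            delta = max(delta, abs(b_new - b[j]))
            b[j] = b_new
        if delta < tol:
            break
    return b, r

def lasso_path_strong_rule(X, y, lams):
    """Fit a decreasing lasso path; at lam_k, discard variable j when
    |x_j' r(lam_{k-1}) / n| < 2*lam_k - lam_{k-1} (sequential strong rule),
    then check the KKT conditions on the discarded set and re-admit violators."""
    n, p = X.shape
    b = np.zeros(p)
    r = y.copy()
    path = []
    lam_prev = lams[0]
    for lam in lams:
        corr = np.abs(X.T @ r) / n
        keep = np.where(corr >= 2 * lam - lam_prev)[0]
        while True:
            b, r = lasso_cd(X, y, lam, b, keep)
            # KKT check: discarded j must satisfy |x_j' r / n| <= lam
            viol = np.where((np.abs(X.T @ r) / n > lam + 1e-10) & (b == 0))[0]
            viol = np.setdiff1d(viol, keep)
            if viol.size == 0:
                break
            keep = np.union1d(keep, viol)  # violation: restore and re-fit
        path.append(b.copy())
        lam_prev = lam
    return np.array(path)
```

Because violations are rare, the inner re-fit loop almost always runs once, so most coordinate descent sweeps touch only the small surviving set rather than all p variables.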
