Adaptive LAD-Lasso, Split Regularized Regression and DLasso: A Simulation Study of Variable Selection
Abstract
In this paper, we compare three main methods for variable selection in linear regression models: the adaptive LAD-Lasso, Split Regularized Regression (SRR), and DLasso (with the AIC, GIC, BIC, and GCV selection criteria). In a simulation study, we assess the performance of these methods in terms of the median model error, including the case where the number of candidate variables exceeds the number of observations. The simulation study is also used to determine which methods perform best across the linear regression scenarios considered.
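The evaluation protocol described above can be sketched in a few lines. This is a hedged illustration only: the three methods compared in the paper (adaptive LAD-Lasso, SRR, DLasso) have no standard scikit-learn implementations, so plain `Lasso` stands in here purely to show how the median model error would be computed over simulation replications, including a p > n setting; the sample sizes, sparse coefficient vector, and penalty level are illustrative assumptions, not the paper's design.

```python
# Hedged sketch of the simulation protocol: plain Lasso is a stand-in
# estimator; the paper's actual methods would replace it. All settings
# below (n, p, reps, beta, alpha) are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, reps = 50, 100, 20           # p > n: more candidates than observations
beta = np.zeros(p)
beta[:5] = [3.0, 1.5, 0.0, 0.0, 2.0]  # sparse true coefficient vector

def model_error(beta_hat, beta_true, Sigma):
    """Model error ME = (beta_hat - beta)' Sigma (beta_hat - beta)."""
    d = beta_hat - beta_true
    return d @ Sigma @ d

Sigma = np.eye(p)                   # assume uncorrelated predictors
errors = []
for _ in range(reps):
    X = rng.standard_normal((n, p))
    y = X @ beta + rng.standard_normal(n)
    fit = Lasso(alpha=0.1).fit(X, y)          # stand-in estimator
    errors.append(model_error(fit.coef_, beta, Sigma))

# The performance measure used for comparison: median over replications.
print(f"median model error: {np.median(errors):.3f}")
```

Each candidate method would be run on the same replicated data sets and ranked by its median model error, which is less sensitive to occasional badly-fitted replications than the mean.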
References
- Arnold, T. B., and Tibshirani, R. J. (2016). Efficient implementations of the generalized Lasso dual-path algorithm, Journal of Computational and Graphical Statistics, 25(1), 1-27.
- Christidis, A.-A., Lakshmanan, L., Smucler, E., and Zamar, R. (2020). Split regularized regression, Technometrics, 62(3), 330-338.
- Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96(456), 1348-1360.
- Fujisawa, H. and Eguchi, S. (2008). Robust parameter estimation with a small bias against heavy contamination, Journal of Multivariate Analysis, 99(9), 2053-2081.
- Haselimashhadi, H. and Vinciotti, V. (2016). A differentiable alternative to the Lasso penalty, https://arxiv.org/abs/1609.04985.
- Koenker, R. and Bassett, G. W. (1978). Regression quantiles, Econometrica, 46, 33-50.
- Lambert-Lacroix, S. and Zwald, L. (2011). Robust regression through Huber's criterion and adaptive Lasso penalty, Electronic Journal of Statistics, 5, 1015-1053.
- Qin, Y., Li, S., and Yu, Y. (2017). Penalized maximum tangent likelihood estimation and robust variable selection, https://arxiv.org/pdf/1708.05439.pdf.
- Rosset, S. and Zhu, J. (2007). Piecewise linear regularized solution paths, The Annals of Statistics, 35(3), 1012-1030.
- Taddy, M. (2017). One-step estimator paths for concave regularization, Journal of Computational and Graphical Statistics, 1-12.
- Tibshirani, R. J., and Taylor, J. (2011). The solution path of the generalized Lasso, The Annals of Statistics, 39(3), 1335-1371.
- Wang, H., Li, G., and Jiang, G. (2007). Robust regression shrinkage and consistent variable selection through the LAD-Lasso, Journal of Business & Economic Statistics, 25, 347-355.
- Yu, K., Chen, C., Reed, C., and Dunson, D. (2013). Bayesian variable selection in quantile regression, Statistics and Its Interface, 6, 261-274.
- Zhu, W., Levy-Leduc, C., and Ternès, N. (2021). A variable selection approach for highly correlated predictors in high-dimensional genomic data, Bioinformatics, 37(16), 2238-2244.
- Zou, H. (2006). The adaptive Lasso and its oracle properties, Journal of the American Statistical Association, 101, 1418-1429.