Acknowledgement
This research was supported by the 2021 research fund (21-군학-19) of the Hwarangdae Research Institute, Korea Military Academy.
References
- Alhamzawi R (2015). Model selection in quantile regression models, Journal of Applied Statistics, 42, 445-458. https://doi.org/10.1080/02664763.2014.959905
- Bang S and Shin S (2016). A comparison study of multiple linear quantile regression using non-crossing constraints, The Korean Journal of Applied Statistics, 29, 773-786. https://doi.org/10.5351/KJAS.2016.29.5.773
- Belloni A and Chernozhukov V (2011). L1-penalized quantile regression in high-dimensional sparse models, The Annals of Statistics, 39, 82-130. https://doi.org/10.1214/10-AOS827
- Bertin-Mahieux T, Ellis DP, Whitman B, and Lamere P (2011). The million song dataset, In Proceedings of the 12th International Conference on Music Information Retrieval (ISMIR).
- Chen L and Zhou Y (2020). Quantile regression in big data: A divide and conquer based strategy, Computational Statistics & Data Analysis, 144, 106892. https://doi.org/10.1016/j.csda.2019.106892
- Chen X and Xie MG (2014). A split-and-conquer approach for analysis of extraordinarily large data, Statistica Sinica, 24, 1655-1684.
- Cole T and Green P (1992). Smoothing reference centile curves: The LMS method and penalized likelihood, Statistics in Medicine, 11, 1305-1319. https://doi.org/10.1002/sim.4780111005
- Cooke LP (2014). Gendered parenthood penalties and premiums across the earnings distribution in Australia, the United Kingdom, and the United States, European Sociological Review, 30, 360-372. https://doi.org/10.1093/esr/jcu044
- Dasgupta A, Drineas P, Harb B, Kumar R, and Mahoney MW (2009). Sampling algorithms and coresets for Lp regression, SIAM Journal on Computing, 38, 2060-2078. https://doi.org/10.1137/070696507
- Fan J and Li R (2001). Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96, 1348-1360. https://doi.org/10.1198/016214501753382273
- Fan TH, Lin DK, and Cheng KF (2007). Regression analysis for massive datasets, Data & Knowledge Engineering, 61, 554-562. https://doi.org/10.1016/j.datak.2006.06.017
- Frank I and Friedman J (1993). A statistical view of some chemometrics regression tools, Technometrics, 35, 109-148. https://doi.org/10.1080/00401706.1993.10485033
- Heagerty P and Pepe M (1999). Semiparametric estimation of regression quantiles with application to standardizing weight for height and age in U.S. children, Journal of the Royal Statistical Society, Series C (Applied Statistics), 48, 533-551. https://doi.org/10.1111/1467-9876.00170
- Jiang R, Qian W, and Zhou Z (2016). Single-index composite quantile regression with heteroscedasticity and general error distributions, Statistical Papers, 57, 185-203. https://doi.org/10.1007/s00362-014-0646-y
- Jung BH and Lim DH (2017). Comparison analysis of big data integration models, Journal of the Korean Data & Information Science Society, 28, 755-768.
- Kang J and Jhun M (2020). Divide-and-conquer random sketched kernel ridge regression for large-scale data analysis, Journal of the Korean Data & Information Science Society, 31, 15-23. https://doi.org/10.7465/jkdi.2020.31.1.15
- Kim KH and Shin SJ (2017). Adaptive ridge procedure for L0-penalized weighted support vector machines, Journal of the Korean Data & Information Science Society, 28, 1271-1278. https://doi.org/10.7465/jkdi.2017.28.6.1271
- Koenker R and Ng P (2005). A Frisch-Newton algorithm for sparse quantile regression, Acta Mathematicae Applicatae Sinica, 21, 225-236. https://doi.org/10.1007/s10255-005-0231-1
- Lee E, Noh H, and Park B (2014). Model selection via Bayesian information criterion for quantile regression models, Journal of the American Statistical Association, 109, 216-229. https://doi.org/10.1080/01621459.2013.836975
- Li R, Lin DK, and Li B (2013). Statistical inference in massive data sets, Applied Stochastic Models in Business and Industry, 29, 399-409. https://doi.org/10.1002/asmb.1927
- Li Y (2008). L1-norm quantile regression, Journal of Computational and Graphical Statistics, 17, 163-185. https://doi.org/10.1198/106186008X289155
- Ning Z and Tang L (2014). Estimation and test procedures for composite quantile regression with covariates missing at random, Statistics & Probability Letters, 95, 15-25. https://doi.org/10.1016/j.spl.2014.08.003
- Okada K and Samreth S (2012). The effect of foreign aid on corruption: a quantile regression approach, Economics Letters, 115, 240-243. https://doi.org/10.1016/j.econlet.2011.12.051
- Peng B and Wang L (2015). An iterative coordinate descent algorithm for high-dimensional nonconvex penalized quantile regression, Journal of Computational and Graphical Statistics, 24, 676-694. https://doi.org/10.1080/10618600.2014.913516
- Portnoy S and Koenker R (1997). The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators, Statistical Science, 12, 279-300. https://doi.org/10.1214/ss/1030037960
- Powell D and Wagner J (2014). The exporter productivity premium along the productivity distribution: evidence from quantile regression with nonadditive firm fixed effects, Review of World Economics, 150, 763-785. https://doi.org/10.1007/s10290-014-0192-7
- Scott SL, Blocker AW, Bonassi FV, Chipman HA, George EI, and McCulloch RE (2016). Bayes and big data: The consensus Monte Carlo algorithm, International Journal of Management Science and Engineering Management, 11, 78-88.
- Scott SL (2017). Comparing consensus Monte Carlo strategies for distributed Bayesian computation, Brazilian Journal of Probability and Statistics, 31, 668-685. https://doi.org/10.1214/17-BJPS365
- Srivastava S, Cevher V, Dinh Q, and Dunson D (2015). WASP: Scalable Bayes via barycenters of subset posteriors, In Artificial Intelligence and Statistics, 912-920.
- Tibshirani R (1996). Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, 58, 267-288.
- Wang H and Leng C (2007). Unified Lasso estimation by least squares approximation, Journal of the American Statistical Association, 102, 1418-1429.
- Wang H, Li B, and Leng C (2009). Shrinkage tuning parameter selection with a diverging number of parameters, Journal of the Royal Statistical Society, Series B, 71, 671-683. https://doi.org/10.1111/j.1467-9868.2008.00693.x
- Wang H, Li R, and Tsai CL (2007). Tuning parameter selectors for the smoothly clipped absolute deviation method, Biometrika, 94, 553-568. https://doi.org/10.1093/biomet/asm053
- Wang L, Wu Y, and Li R (2012). Quantile regression for analyzing heterogeneity in ultra-high dimension, Journal of the American Statistical Association, 107, 214-222. https://doi.org/10.1080/01621459.2012.656014
- Wu Y and Liu Y (2009). Variable selection in quantile regression, Statistica Sinica, 19, 801-817.
- Xu Q, Cai C, Jiang C, Sun F, and Huang X (2020). Block average quantile regression for massive dataset, Statistical Papers, 61, 141-165. https://doi.org/10.1007/s00362-017-0932-6
- Xue J and Liang F (2019). Double-parallel Monte Carlo for Bayesian analysis of big data, Statistics and Computing, 29, 23-32. https://doi.org/10.1007/s11222-017-9791-1
- Yang J, Meng X, and Mahoney MW (2014). Quantile regression for large-scale applications, SIAM Journal on Scientific Computing, 36, 78-110.
- Zhang Y, Duchi J, and Wainwright M (2015). Divide and conquer kernel ridge regression: A distributed algorithm with minimax optimal rates, Journal of Machine Learning Research, 16, 3299-3340.