응용통계연구 = The Korean Journal of Applied Statistics, v.35 no.2, 2022, pp.217-227
강종경 (Department of Information Statistics, Kangwon National University), 한석원 (Department of Mathematics, Korea Military Academy), 방성완 (Department of Mathematics, Korea Military Academy)
Quantile regression is widely used in many fields because it provides an efficient tool for examining complex information latent in variables. However, modern large-scale, high-dimensional data make it very difficult to estimate the quantile regression model due to limitations in...
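The abstract only sketches the setting. As a hedged illustration (a minimal sketch, not the authors' algorithm), a quantile regression fit minimizes the pinball (check) loss, and a common divide-and-conquer remedy for large samples fits the model on data subsets and averages the subset estimates. All variable names below are illustrative, and the simulated data are hypothetical.

```python
# Minimal sketch of quantile regression via the pinball (check) loss,
# plus a naive divide-and-conquer variant for large samples.
# This is NOT the paper's estimator; it only illustrates the general idea.
import numpy as np
from scipy.optimize import minimize

def fit_quantile(x, y, tau):
    """Fit y ~ b0 + b1*x at quantile level tau by minimizing the pinball loss."""
    def pinball(beta):
        r = y - (beta[0] + beta[1] * x)
        # rho_tau(r) = r * (tau - I(r < 0)) = max(tau*r, (tau-1)*r)
        return np.mean(np.maximum(tau * r, (tau - 1) * r))
    return minimize(pinball, x0=[0.0, 0.0], method="Nelder-Mead").x

rng = np.random.default_rng(0)
n = 2000
x = rng.uniform(0, 10, n)
# Heteroscedastic noise: the noise scale grows with x, so upper quantiles
# have a steeper slope than the median.
y = 1.0 + 2.0 * x + rng.normal(0, 1 + 0.3 * x, n)

b_med = fit_quantile(x, y, tau=0.5)   # median regression
b_hi = fit_quantile(x, y, tau=0.9)    # upper-quantile regression

# Divide and conquer: split into K blocks, fit each block, average estimates.
K = 10
parts = np.array_split(np.arange(n), K)
b_dc = np.mean([fit_quantile(x[idx], y[idx], tau=0.5) for idx in parts], axis=0)
```

Under heteroscedastic noise of this kind, the fitted slope at tau = 0.9 exceeds the median slope, and the averaged divide-and-conquer estimate stays close to the full-data median fit while each subset fit only touches n/K observations.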