
Gradient boosting machines, a tutorial

Frontiers in Neurorobotics, v. 7, 2013, p. 21

Natekin, Alexey (fortiss GmbH, Munich, Germany); Knoll, Alois (Department of Informatics, Technical University Munich, Garching, Germany)

Abstract

Gradient boosting machines are a family of powerful machine-learning techniques that have shown considerable success in a wide range of practical applications. They are highly customizable to the particular needs of the application, like being learned with respect to different loss functions. This a...
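The core idea the tutorial covers can be illustrated with a minimal from-scratch sketch, assuming squared-error loss and depth-1 regression stumps as base learners (this is an illustration of the general technique, not the paper's own code): each boosting round fits a stump to the current residuals, which for squared loss equal the negative gradient, and adds a shrunken copy of it to the ensemble.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a depth-1 regression tree (stump) to residuals r by scanning
    candidate thresholds for the best squared-error split on 1-D input x."""
    best = None
    for t in np.unique(x)[:-1]:          # exclude max so the right side is nonempty
        left, right = r[x <= t], r[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda q: np.where(q <= t, lv, rv)

def gbm_fit(x, y, n_rounds=50, lr=0.1):
    """Gradient boosting with squared loss: start from the mean prediction,
    then repeatedly fit a stump to the residuals (the negative gradient of
    the L2 loss) and add a learning-rate-damped copy of it."""
    base = y.mean()
    pred = np.full_like(y, base, dtype=float)
    stumps = []
    for _ in range(n_rounds):
        h = fit_stump(x, y - pred)       # residuals play the role of pseudo-responses
        pred = pred + lr * h(x)
        stumps.append(h)

    def predict(q):
        out = np.full(len(q), base, dtype=float)
        for h in stumps:
            out = out + lr * h(q)
        return out
    return predict

# Toy 1-D regression: the additive ensemble of stumps recovers a step function.
x = np.linspace(0.0, 1.0, 100)
y = np.where(x < 0.5, 1.0, 3.0)
predict = gbm_fit(x, y)
print(np.abs(predict(x) - y).mean())     # small training error
```

Swapping in a different loss only changes which pseudo-responses the stumps are fit to (e.g. sign of the residual for absolute loss), which is the customizability the abstract refers to.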



Open Access (OA) type: GOLD (published in an open-access journal)
