Journal of Korean Society of Industrial and Systems Engineering = 한국산업경영시스템학회지, Vol. 44, No. 2, 2021, pp. 15-23
홍정식 (Department of Industrial Engineering, Seoul National University of Science and Technology), 황근성 (Department of Data Science, Graduate School, Seoul National University of Science and Technology)
Most of the open-source decision tree algorithms are based on three splitting criteria (Entropy, Gini Index, and Gain Ratio). Therefore, the advantages and disadvantages of these three popular algorithms need to be studied more thoroughly. Comparisons of the three algorithms were mainly performed wi...
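The three splitting criteria named in the abstract have standard textbook definitions. As an illustrative sketch (not the authors' code; function names and the `partition` convention are my own), they can be computed from class-label arrays as follows:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array (the ID3/C4.5 impurity measure)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(labels):
    """Gini impurity of a label array (the CART criterion)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def gain_ratio(labels, partition):
    """C4.5's Gain Ratio: information gain normalized by split information.

    `partition` is a list of label sub-arrays, one per child node of the split.
    """
    n = len(labels)
    weights = np.array([len(part) / n for part in partition])
    # Information gain: parent entropy minus weighted child entropies.
    gain = entropy(labels) - np.sum(weights * np.array([entropy(p) for p in partition]))
    # Split information penalizes splits with many small branches.
    split_info = -np.sum(weights * np.log2(weights))
    return gain / split_info
```

For a balanced binary parent split perfectly into pure children, the gain ratio is 1.0; the normalization by split information is what distinguishes Gain Ratio from plain information gain and reduces its bias toward many-valued attributes.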
An, A. and Cercone, N., Rule Quality Measures for Rule Induction Systems, Computational Intelligence, 2001, Vol. 17, No. 3, pp. 409-424.
Baesens, B., Mues, C., De Backer, M., and Vanthienen, J., Building Intelligent Credit Scoring Systems Using Decision Tables, In : Enterprise Information Systems V, 2004, pp. 131-137.
Belle, V. and Papantonis, I., Principles and Practice of Explainable Machine Learning, arXiv preprint arXiv:2009.11698, 2020.
Breiman, L., Friedman, J.H., Olshen, R.A., and Stone, C.J., Classification and Regression Trees, Monterey, CA, USA : Wadsworth and Brooks, 1984.
Breiman, L., Heuristics of Instability and Stabilization in Model Selection, The Annals of Statistics, 1996, Vol. 24, No. 6, pp. 2350-2383.
Briand, B., Ducharme, G.R., Parache, V., and Mercat-Rommens, C., A Similarity Measure to Assess the Stability of Classification Trees, Computational Statistics and Data Analysis, 2009, Vol. 53, No. 4, pp. 1208-1217.
Buntine, W. and Niblett, T., A Further Comparison of Splitting Rules for Decision-Tree Induction, Machine Learning, 1992, Vol. 8, No. 1, pp. 75-85.
Cano, A., Zafra, A., and Ventura, S., An Interpretable Classification Rule Mining Algorithm, Information Sciences, 2013, Vol. 240, pp. 1-20.
Chandra, B., Kothari, R., and Paul, P., A New Node Splitting Measure for Decision Tree Construction, Pattern Recognition, 2010, Vol. 43, No. 8, pp. 2725-2731.
Dannegger, F., Tree Stability Diagnostics and Some Remedies for Instability, Statistics in Medicine, 2000, Vol. 19, No. 4, pp. 475-491.
Freitas, A.A., Comprehensible Classification Models : A Position Paper, ACM SIGKDD Explorations Newsletter, 2014, Vol. 15, No. 1, pp. 1-10.
Garcia, S., Fernandez, A., and Herrera, F., Enhancing the Effectiveness and Interpretability of Decision Tree and Rule Induction Classifiers with Evolutionary Training Set Selection over Imbalanced Problems, Applied Soft Computing, 2009, Vol. 9, No. 4, pp. 1304-1314.
Goldstein, A. and Buja, A., Penalized Split Criteria for Interpretable Trees, arXiv preprint arXiv:1310.5677, 2013.
Harris, E., Information Gain Versus Gain Ratio : A Study of Split Method Biases, The MITRE Corp., McLean, VA, USA, Tech. Rep., 2001.
Jacobucci, R., Decision Tree Stability and its Effect on Interpretation, Retrieved from osf.io/m5p2v, 2018.
Jaworski, M., Duda, P., and Rutkowski, L., New Splitting Criteria for Decision Trees in Stationary Data Streams, IEEE Transactions on Neural Networks and Learning Systems, 2018, Vol. 29, No. 6, pp. 2516-2529.
Kotsiantis, S.B., Supervised Machine Learning : A Review of Classification Techniques, Informatica, 2007, Vol. 31, No. 3, pp. 249-268.
Leiva, R.G., Anta, A.F., Mancuso, V., and Casari, P., A Novel Hyperparameter-Free Approach to Decision Tree Construction that Avoids Overfitting by Design, IEEE Access, 2019, Vol. 7, pp. 99978-99987.
Li, R.-H. and Belford, G.G., Instability of Decision Tree Classification Algorithms, In Proceedings of the 8th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Edmonton, Alberta, Canada, 2002, pp. 570-575.
Lustrek, M., Gams, M., and Martincic-Ipsic, S., What Makes Classification Trees Comprehensible?, Expert Systems with Applications, 2016, Vol. 62, pp. 333-346.
Martens, D., Vanthienen, J., Verbeke, W., and Baesens, B., Performance of Classification Models from a User Perspective, Decision Support Systems, 2011, Vol. 51, No. 4, pp. 782-793.
Murdoch, W.J., Singh, C., Kumbier, K., Abbasi-Asl, R., and Yu, B., Interpretable Machine Learning : Definitions, Methods, and Applications, arXiv preprint arXiv:1901.04592, 2019.
Norouzi, M., Collins, M., Johnson, M.A., Fleet, D.J., and Kohli, P., Efficient Non-Greedy Optimization of Decision Trees, Proceedings of the 28th International Conference on Neural Information Processing Systems, 2015, pp. 1729-1737.
Pazzani, M.J., Mani, S., and Shankle, W.R., Acceptance of Rules Generated by Machine Learning Among Medical Experts, Methods of Information in Medicine, 2001, Vol. 40, No. 5, pp. 380-385.
Quinlan, J.R., C4.5 : Programs for Machine Learning, San Francisco, CA, USA : Morgan Kaufmann, 1993.
Quinlan, J.R., Induction of Decision Trees, Machine Learning, 1986, Vol. 1, No. 1, pp. 81-106.
Verbeke, W., Martens, D., Mues, C., and Baesens, B., Building Comprehensible Customer Churn Prediction Models with Advanced Rule Induction Techniques, Expert Systems with Applications, 2011, Vol. 38, No. 3, pp. 2354-2364.
Wang, Y. and Xia, S.-T., Unifying Attribute Splitting Criteria of Decision Trees by Tsallis Entropy, in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017, pp. 2507-2511.
Wang, Y., Xia, S.-T., and Wu, J., A Less-Greedy Two-Term Tsallis Entropy Information Metric Approach for Decision Tree Classification, Knowledge-Based Systems, 2017, Vol. 120, pp. 34-42.
Zhao, Y. and Zhang, Y., Comparison of Decision Tree Methods for Finding Active Objects, Advances in Space Research, 2008, Vol. 41, No. 12, pp. 1955-1959.