
Can machine learning explain human learning?

Neurocomputing, v.192, 2016, pp. 14-28

Vahdat, M. ,  Oneto, L. ,  Anguita, D. ,  Funk, M. ,  Rauterberg, M.

Abstract

Learning Analytics (LA) has a major interest in exploring and understanding the learning process of humans and, for this purpose, benefits from both Cognitive Science, which studies how humans learn, and Machine Learning, which studies how algorithms learn from data. Usually, Machine Learning is exp...
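A notion central to this paper and to several of the works it cites (e.g. refs. 27-30, 57-59 below) is the empirical Rademacher complexity: how well a hypothesis class can correlate with random ±1 labels on a fixed sample, which bounds its capacity to overfit. As a minimal illustrative sketch (not code from the paper; the toy class `H` and all values are hypothetical), a Monte Carlo estimate for a finite class might look like:

```python
import random

def empirical_rademacher(outputs, n_trials=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of a
    finite hypothesis class on a fixed sample of size n.

    outputs: list of lists, where outputs[h][i] is hypothesis h's
    prediction (+1 or -1) on sample point i.
    """
    rng = random.Random(seed)
    n = len(outputs[0])
    total = 0.0
    for _ in range(n_trials):
        # Draw a vector of independent Rademacher signs.
        sigma = [rng.choice((-1, 1)) for _ in range(n)]
        # Take the supremum over the class of the signed average agreement.
        total += max(
            sum(s * f[i] for i, s in enumerate(sigma)) / n
            for f in outputs
        )
    return total / n_trials

# Hypothetical toy class: three +/-1 classifiers on a sample of size 8.
H = [
    [1, 1, 1, 1, -1, -1, -1, -1],
    [1, -1, 1, -1, 1, -1, 1, -1],
    [1, 1, -1, -1, 1, 1, -1, -1],
]
print(round(empirical_rademacher(H), 3))
```

A richer class fits random signs better and so scores higher; the human-subject analogue studied in refs. 57-58 replaces the algorithmic learner with people memorizing randomly labeled items.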

Keywords

References (79)

  1. Chatti, A reference model for learning analytics, Int. J. Technol. Enhanc. Learn. 4(5) (2012) 318, doi:10.1504/IJTEL.2012.051815.

  2. J.I. Lee, E. Brunskill, The impact on individualizing student models on necessary practice opportunities, in: International Conference on Educational Data Mining, 2012.

  3. M. Brown, Learning analytics: moving from concept to practice, in: EDUCAUSE Learning Initiative, 2012.

  4. M. Vahdat, A. Ghio, L. Oneto, D. Anguita, M. Funk, M. Rauterberg, Advances in learning analytics and educational data mining, in: European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, 2015.

  5. Papamitsiou, Learning analytics and educational data mining in practice, Educ. Technol. Soc. 17(4) (2014) 49.

  6. Siemens, Penetrating the fog, EDUCAUSE Rev. 46(5) (2011) 30.

  7. M. Bienkowski, M. Feng, B. Means, Enhancing teaching and learning through educational data mining and learning analytics: an issue brief, US Department of Education, Office of Educational Technology, 2012, pp. 1-57.

  8. Koedinger, New potentials for data-driven intelligent tutoring system development and optimization, AI Mag. 34(3) (2013) 27, doi:10.1609/aimag.v34i3.2484.

  9. C. Piech, J. Huang, Z. Chen, C. Do, A. Ng, D. Koller, Tuned models of peer assessment in MOOCs, arXiv preprint arXiv:1307.2579, 2013.

  10. Ferguson, Learning analytics, Int. J. Technol. Enhanc. Learn. 4(5) (2012) 304, doi:10.1504/IJTEL.2012.051816.

  11. Polk, Cognitive Modeling, 2002.

  12. Bishop, Pattern Recognition and Machine Learning, 2006.

  13. Lillo-Castellano, Traffic sign segmentation and classification using statistical learning methods, Neurocomputing 153 (2015) 286, doi:10.1016/j.neucom.2014.11.026.

  14. Yuan, Image quality assessment, Neurocomputing 159 (2015) 227, doi:10.1016/j.neucom.2015.01.066.

  15. Zhang, Adaptive energy detection for bird sound detection in complex environments, Neurocomputing 155 (2015) 108, doi:10.1016/j.neucom.2014.12.042.

  16. Y. Tian, Q. Ruan, G. An, W. Xu, Context and locality constrained linear coding for human action recognition, Neurocomputing (2016), doi:10.1016/j.neucom.2015.04.059, in press.

  17. MacKay, Information Theory, Inference and Learning Algorithms, 2003.

  18. Hastie, Unsupervised Learning, 2009.

  19. Shawe-Taylor, Kernel Methods for Pattern Analysis, 2004.

  20. Baker, The roles of models in artificial intelligence and education research, J. Artif. Intell. Educ. 11 (2000) 122.

  21. Kotsiantis, Predicting students' performance in distance learning using machine learning techniques, Appl. Artif. Intell. 18(5) (2004) 411, doi:10.1080/08839510490442058.

  22. P. Brusilovsky, S. Sosnovsky, O. Shcherbinina, User modeling in a distributed e-learning architecture, in: User Modeling, 2005, doi:10.1007/11527886_50.

  23. M. Rauterberg, S. Schluep, M. Fjeld, How to model behavioural and cognitive complexity in human-computer interaction with Petri nets, in: International Workshop on Robot and Human Communication, 1997.

  24. K.E. Arnold, M.D. Pistilli, Course signals at Purdue: using learning analytics to increase student success, in: International Conference on Learning Analytics and Knowledge, 2012, doi:10.1145/2330601.2330666.

  25. Triantafillou, The design and the formative evaluation of an adaptive educational system based on cognitive styles, Comput. Educ. 41(1) (2003) 87, doi:10.1016/S0360-1315(03)00031-9.

  26. Vapnik, Statistical Learning Theory, 1998.

  27. Anguita, In-sample and out-of-sample model selection and error estimation for support vector machines, IEEE Trans. Neural Netw. Learn. Syst. 23(9) (2012) 1390, doi:10.1109/TNNLS.2012.2202401.

  28. Koltchinskii, Rademacher penalties and structural risk minimization, IEEE Trans. Inf. Theory 47(5) (2001) 1902, doi:10.1109/18.930926.

  29. Bartlett, Local Rademacher complexities, Ann. Stat. 33(4) (2005) 1497, doi:10.1214/009053605000000282.

  30. Anguita, Maximal discrepancy for support vector machines, Neurocomputing 74(9) (2011) 1436, doi:10.1016/j.neucom.2010.12.009.

  31. D.A. McAllester, Some PAC-Bayesian theorems, in: Computational Learning Theory, 1998, doi:10.1145/279943.279989.

  32. Lever, Tighter PAC-Bayes bounds through distribution-dependent priors, Theor. Comput. Sci. 473 (2013) 4, doi:10.1016/j.tcs.2012.10.013.

  33. L. Oneto, A. Ghio, S. Ridella, D. Anguita, Fully empirical and data-dependent stability-based bounds, IEEE Trans. Cybern. (2016), doi:10.1109/TCYB.2014.2361857, in press.

  34. Floyd, Sample compression, learnability, and the Vapnik-Chervonenkis dimension, Mach. Learn. 21(3) (1995) 269, doi:10.1007/BF00993593.

  35. Bousquet, Stability and generalization, J. Mach. Learn. Res. 2 (2002) 499.

  36. Poggio, General conditions for predictivity in learning theory, Nature 428(6981) (2004) 419, doi:10.1038/nature02341.

  37. Bruner, A Study of Thinking, 1956.

  38. Watanabe, Pattern Recognition: Human and Mechanical, 1985.

  39. Pashler, When does fading enhance perceptual category learning?, J. Exp. Psychol.: Learn. Memory Cogn. 39(4) (2013) 1162.

  40. M. Rauterberg, About a framework for information and information processing of learning systems, in: ISCO, 1995, doi:10.1007/978-0-387-34870-4_7.

  41. M. Rauterberg, E. Ulich, Information processing for learning systems: an action theoretical approach, in: IEEE International Conference on Systems, Man, and Cybernetics, 1996.

  42. Goodman, A rational analysis of rule-based concept learning, Cogn. Sci. 32(1) (2008) 108, doi:10.1080/03640210701802071.

  43. D. Vats, C. Studer, A.S. Lan, L. Carin, R. Baraniuk, Test-size reduction for concept estimation, in: International Conference on Educational Data Mining, 2013.

  44. Kruschke, Alcove, Psychol. Rev. 99(1) (1992) 22, doi:10.1037/0033-295X.99.1.22.

  45. Medin, Context theory of classification learning, Psychol. Rev. 85(3) (1978) 207, doi:10.1037/0033-295X.85.3.207.

  46. Nosofsky, A rule-plus-exception model for classifying objects in continuous-dimension spaces, Psychon. Bull. Rev. 5(3) (1998) 345, doi:10.3758/BF03208813.

  47. Murphy, The Big Book of Concepts, 2002.

  48. R.L. Goldstone, A. Kersten, Concepts and categorization, in: Handbook of Psychology, 2003, doi:10.1002/0471264385.wei0422.

  49. Deák, New trends in cognitive science, Neurocomputing 70(13) (2007) 2139, doi:10.1016/j.neucom.2006.06.008.

  50. Madani, Multi-level cognitive machine-learning based concept for human-like artificial walking, Neurocomputing 74(8) (2011) 1213, doi:10.1016/j.neucom.2010.07.021.

  51. Matsuka, Toward a descriptive cognitive model of human learning, Neurocomputing 71(13) (2008) 2446, doi:10.1016/j.neucom.2007.12.039.

  52. T. Joachims, Learning representations of student knowledge and educational content, in: International Conference on Machine Learning Workshop-Machine Learning for Education, 2015.

  53. C. Piech, M. Sahami, D. Koller, S. Cooper, P. Blikstein, Modeling how students learn to program, in: ACM Technical Symposium on Computer Science Education, 2012, doi:10.1145/2157136.2157182.

  54. Lan, Sparse factor analysis for learning and content analytics, J. Mach. Learn. Res. 15(1) (2014) 1959.

  55. Griffiths, Using category structures to test iterated learning as a method for identifying inductive biases, Cogn. Sci. 32(1) (2008) 68, doi:10.1080/03640210701801974.

  56. Feldman, Minimization of Boolean complexity in human concept learning, Nature 407(6804) (2000) 630, doi:10.1038/35036586.

  57. X. Zhu, B.R. Gibson, T.T. Rogers, Human Rademacher complexity, in: Neural Information Processing Systems, 2009.

  58. M. Vahdat, L. Oneto, A. Ghio, D. Anguita, M. Funk, M. Rauterberg, Human algorithmic stability and human Rademacher complexity, in: European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, 2015.

  59. Bartlett, Rademacher and Gaussian complexities, J. Mach. Learn. Res. 3 (2003) 463.

  60. Klesk, Sets of approximating functions with finite Vapnik-Chervonenkis dimension for nearest-neighbors algorithms, Pattern Recognit. Lett. 32(14) (2011) 1882, doi:10.1016/j.patrec.2011.07.012.

  61. Schacter, Psychology, 2010.

  62. X. Zhu, Machine teaching: an inverse problem to machine learning and an approach toward optimal education, in: AAAI Conference on Artificial Intelligence (Senior Member Track), 2015, doi:10.1609/aaai.v29i1.9761.

  63. Bartlett, Model selection and error estimation, Mach. Learn. 48(1-3) (2002) 85, doi:10.1023/A:1013999503812.

  64. L. Oneto, A. Ghio, S. Ridella, D. Anguita, Global Rademacher complexity bounds: from slow to fast convergence rates, Neural Process. Lett. (2016), doi:10.1007/s11063-015-9429-2, in press.

  65. L. Oneto, A. Ghio, S. Ridella, D. Anguita, Learning resource-aware classifiers for mobile devices: from regularization to energy efficiency, Neurocomputing (2016), doi:10.1016/j.neucom.2014.12.099, in press.

  66. Mukherjee, Learning theory, Adv. Comput. Math. 25(1) (2006) 161, doi:10.1007/s10444-004-7634-z.

  67. McDiarmid, On the method of bounded differences, Surv. Comb. 141(1) (1989) 148.

  68. G. Casella, R.L. Berger, Statistical Inference, vol. 2, Duxbury, Pacific Grove, CA, 2002.

  69. Devroye, A Probabilistic Theory of Pattern Recognition, 1996.

  70. Steinwart, Consistency of support vector machines and other regularized kernel classifiers, IEEE Trans. Inf. Theory 51(1) (2005) 128, doi:10.1109/TIT.2004.839514.

  71. Dietrich, Statistical mechanics of support vector networks, Phys. Rev. Lett. 82(14) (1999) 2975, doi:10.1103/PhysRevLett.82.2975.

  72. Opper, On the ability of the optimal perceptron to generalise, J. Phys. A: Math. Gen. 23(11) (1990) L581, doi:10.1088/0305-4470/23/11/012.

  73. M. Opper, Statistical mechanics of learning: generalization, in: The Handbook of Brain Theory and Neural Networks, 1995, doi:10.1007/978-1-4612-0723-8_5.

  74. Mukherjee, Estimating dataset size requirements for classifying DNA microarray data, J. Comput. Biol. 10(2) (2003) 119, doi:10.1089/106652703321825928.

  75. Hoeffding, Probability inequalities for sums of bounded random variables, J. Am. Stat. Assoc. 58(301) (1963) 13, doi:10.1080/01621459.1963.10500830.

  76. D. Anguita, A. Ghio, L. Oneto, S. Ridella, Maximal discrepancy vs. Rademacher complexity for error estimation, in: European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, 2011.

  77. Oneto, Local Rademacher complexity, Neural Netw. 65 (2015) 115, doi:10.1016/j.neunet.2015.02.006.

  78. D.A. Medler, A. Arnoldussen, J.R. Binder, M.S. Seidenberg, The Wisconsin perceptual attribute ratings database, <http://www.neuro.mcw.edu/ratings/>, 2005.

  79. Chater, Simplicity, Trends Cogn. Sci. 7(1) (2003) 19, doi:10.1016/S1364-6613(02)00005-0.
