
[International paper] K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations — View full text

Scientific Data, v.7 no.1, 2020, pp. 293–

Park, Cheul Young (Korea Advanced Institute of Science and Technology, Graduate School of Knowledge Service Engineering, Daejeon, 34141 South Korea); Cha, Narae (Korea Advanced Institute of Science and Technology, Graduate School of Knowledge Service Engineering, Daejeon, 34141 South Korea); Kang, Soowon (Korea Advanced Institute of Science and Technology, Graduate School of Knowledge Service Engineering, Daejeon, 34141 South Korea); Kim, Auk (Korea Advanced Institute of Science and Technology, Graduate School of Knowledge Service Engineering, Daejeon, 34141 South Korea); Khandoker, Ahsan Habib (Khalifa University of Science and Technology, Department of Biomedical Engineering, Abu Dhabi, 127788 United Arab Emirates); Hadjileontiadis, Leontios (Khalifa University of Science and Technology, Department of Biomedical Engineering, Abu Dhabi, 127788 United Arab Emirates); Oh, Alice (Korea Advanced Institute of Science and Technology, School of Computing, Daejeon, 34141 South Korea); Jeong, Yong (Korea Advanced Institute of Science and Technology, Department of Bio and Brain Engineering, Daejeon, 34141 South Korea); Lee, Uichin (Ko)

Abstract

Recognizing emotions during social interactions has many potential applications with the popularization of low-cost mobile sensors, but a challenge remains with the lack of naturalistic affective interaction data. Most existing emotion datasets do not support studying idiosyncratic emotions arising ...



Open Access (OA) type: GOLD — an article published in an open-access journal.
