Methods of facial coding scoring for optimally identifying consumers' responses to arrive at effective, incisive, actionable conclusions
Bibliographic Data
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): G06K-009/00; G06K-009/46; G06K-009/66
Application No.: US-0856113 (2010-08-13)
Registration No.: US-8326002 (2012-12-04)
Inventor: Hill, Daniel A.
Applicant: Sensory Logic, Inc.
Attorney/Agent: Schwegman Lundberg & Woessner, P.A.
Citation information: cited by 17 patents; cites 57 patents
Abstract
The present disclosure relates to a method of assessing consumer reaction to a stimulus, comprising receiving a visual recording stored on a computer-readable medium of facial expressions of at least one human subject as the subject is exposed to a business stimulus so as to generate a chronological sequence of recorded facial images; accessing the computer-readable medium for automatically detecting and recording expressional repositioning of each of a plurality of selected facial features by conducting a computerized comparison of the facial position of each selected facial feature through sequential facial images; automatically coding contemporaneously detected and recorded expressional repositionings to at least a first action unit, wherein the action unit maps to a first set of one or more possible emotions expressed by the human subject; assigning a numerical weight to each of the one or more possible emotions of the first set based upon both the number of emotions in the set and the common emotions in at least a second set of one or more possible emotions related to at least one other second action unit observed within a predetermined time period.
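The weighting step the abstract describes is arithmetic: each candidate emotion of the first action unit gets a weight that depends on the size of its emotion set and on whether the emotion also appears in the set of a second, contemporaneous action unit. A minimal sketch of that idea follows; the boost factor and normalization are assumptions for illustration (the patent only requires, per claim 2, that common emotions outweigh non-common ones, and, per claim 4, that weights can be read as probabilities):

```python
def weight_emotions(first_set: set, second_set: set) -> dict:
    """Weight each candidate emotion of the first action unit.

    Weights start uniform (1 / |first_set|); emotions also present in
    the second action unit's set receive a boost, so cross-confirmed
    emotions score higher.  The boost factor 2.0 is a hypothetical
    choice, not specified by the patent.
    """
    base = 1.0 / len(first_set)
    boost = 2.0  # assumed factor; the patent does not fix a value
    raw = {e: base * (boost if e in second_set else 1.0) for e in first_set}
    total = sum(raw.values())
    # Normalize so the weights can be read as probabilities (claim 4).
    return {e: w / total for e, w in raw.items()}

weights = weight_emotions({"fear", "surprise"}, {"surprise", "happiness"})
# "surprise" is common to both action units' sets, so it outweighs "fear"
```

With the assumed factor, the common emotion ("surprise") ends up with twice the weight of the non-common one, and the weights sum to 1.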
Representative Claims
1. A method of assessing consumer reaction to a stimulus, comprising:
(a) receiving a visual recording stored on a computer-readable medium of facial expressions of at least one human subject as the subject is exposed to a stimulus so as to generate a chronological sequence of recorded facial images;
(b) accessing the computer-readable medium for automatically detecting and recording expressional repositioning of each of a plurality of selected facial features by conducting a computerized comparison of the facial position of each selected facial feature through sequential facial images;
(c) automatically coding contemporaneously detected and recorded expressional repositionings to at least a first action unit, wherein the action unit maps to a first set of one or more possible emotions expressed by the human subject;
(d) assigning a numerical weight to each of the one or more possible emotions of the first set based upon the number of emotions in the set and common emotions correlating to at least a second set of one or more possible emotions related to at least one other second action unit detected within a predetermined time period of detecting the at least a first action unit.

2. The method of claim 1, wherein the numerical weight assigned to the common emotions between the one or more possible emotions of the first set and the at least second set is greater than non-common emotions.

3. The method of claim 2, wherein the difference between the numerical weights assigned to the common emotions and the non-common emotions is proportional to the temporal difference between detection of the first action unit and the second action unit.

4. The method of claim 1, wherein the numerical weight associated to the common emotions represents a probability that the human subject is expressing that particular emotion at the time the first action unit is coded.

5. The method of claim 1, wherein at least one of the first and second sets of one or more possible emotions comprises surprise.

6. The method of claim 5, wherein at least one of the first and second sets of one or more possible emotions comprises an emotion other than surprise.

7. The method of claim 6, further comprising assigning a valence of positive, negative, or a blend of positive and negative to the surprise emotion based upon the emotion other than surprise.

8. The method of claim 1, wherein at least one of the first and second sets of one or more possible emotions comprises happiness, and is associated with a type of smile.

9. The method of claim 8, wherein at least one of the first and second sets of one or more possible emotions comprises an emotion other than happiness.

10. The method of claim 9, further comprising modifying the type of smile associated with the happiness emotion based upon the emotion other than happiness.

11. The method of claim 1, further comprising identifying secondary emotions based upon the first action unit and the second action unit.

12. The method of claim 11, wherein identifying secondary emotions from the first and second action units comprises using the first and second action units as indices into a table that contains the secondary emotions.

13. The method of claim 1, further comprising calculating the duration of the first and second action units.

14. The method of claim 13, wherein the numerical weight assigned to the common emotions is based at least in part upon the duration of the first action unit.

15. The method of claim 1, further comprising calculating an engagement score, wherein the engagement score comprises a percentage of human subjects that exhibited at least one action unit during a defined time period.

16. The method of claim 15, further comprising calculating an impact score for each of the action units where the impact score relates to the intensity of the expressed action unit.

17. The method of claim 16, further comprising calculating a stopping power by multiplying the impact score by the engagement score.

18. The method of claim 17, further comprising displaying on an output screen at least one of: the impact score, the engagement score, and the stopping power.

19. The method of claim 17, further comprising calculating a persuasion score relating to the power of the stimulus to persuade the one or more human subjects to agree with a message conveyed by the stimulus, wherein the persuasion score is calculated based, at least in part, upon the engagement and appeal scores, wherein the appeal score relates to the valence of an emotion.

20. The method of claim 19, further comprising calculating a longevity score of the stimulus, relating to the resistance of a stimulus to consumer fatigue, and wherein the longevity score is based, at least in part, upon the engagement score, impact score, and appeal score.

21. The method of claim 1, further comprising displaying on an output screen the numerical weight assigned to the emotions of at least one action unit.

22. A method of assessing consumer reaction to a stimulus, comprising:
(a) receiving a visual recording stored on a computer-readable medium of facial expressions of at least one human subject as the subject is exposed to a stimulus so as to generate a chronological sequence of recorded facial images;
(b) accessing the computer-readable medium for automatically detecting and recording expressional repositioning of each of a plurality of selected facial features by conducting a computerized comparison of the facial position of each selected facial feature through sequential facial images;
(c) automatically coding contemporaneously detected and recorded expressional repositionings to at least one action unit;
(d) assigning a weight to one or more possible emotions associated with the at least one action unit based upon at least one second action unit.

23. The method of claim 22, wherein the second action unit relates to expressional repositionings detected within a predetermined time period of detecting the at least one action unit.
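Two of the derived metrics in the claims are fully specified as arithmetic: the engagement score (claim 15) is the percentage of subjects showing at least one action unit in a defined time period, and stopping power (claim 17) is the product of the impact and engagement scores. A minimal sketch of just those two follows; how impact is derived from action-unit intensity (claim 16), and the exact persuasion and longevity formulas (claims 19-20), are not spelled out by the patent, so they are deliberately left out:

```python
def engagement_score(num_subjects_with_au: int, total_subjects: int) -> float:
    """Claim 15: percentage of human subjects that exhibited at least
    one action unit during the defined time period."""
    return 100.0 * num_subjects_with_au / total_subjects

def stopping_power(impact_score: float, engagement: float) -> float:
    """Claim 17: stopping power is the impact score multiplied by the
    engagement score.  The impact score's intensity mapping (claim 16)
    is taken as a given input here."""
    return impact_score * engagement

print(engagement_score(12, 48))   # 25.0 (percent of subjects engaged)
print(stopping_power(3.0, 25.0))  # 75.0
```

The persuasion score (claims 19) would additionally fold in an appeal score tied to emotional valence, and the longevity score (claim 20) combines engagement, impact, and appeal; the patent leaves their exact weighting open.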
Patents cited by this patent (57)
Kang, Dong-joong; Ryoo, Sung-gul; Kim, Ji-yeun; Kim, Chang-yeong; Seo, Yang-seock, Apparatus and method for detecting speaking person's eyes and face.
Miyata, Takashi; Matsuo, Noriyoshi; Uchida, Hitoshi; Sawada, Naomi; Tomita, Yutaka, Apparatus and system for measuring electrical potential variations in human body.
Petajan, Eric D., Electronic facial tracking and detection system and method and apparatus for automated speech recognition.
Von Kohorn, Henry, Evaluation of responses of participatory broadcast audience with prediction of winning contestants; monitoring, checking and controlling of wagering, and automatic crediting and couponing.
Fedorovskaya,Elena A.; Endrikhovski,Serguei; Parulski,Kenneth A.; Zacks,Carolyn A.; Taxier,Karen M.; Telek,Michael J.; Marino,Frank; Harel,Dan, Imaging method and system for health monitoring and personal security.
Blazey, Richard N.; Miller, Paige; Fedorovskaya, Elena A.; Prabhu, Girish V.; Parks, Peter A.; Patton, David L.; Fredlund, John R.; Horwitz, Cecelia M.; Mir, Jose M., Management of physiological and psychological state of an individual using images biometric analyzer.
Silverstein, Fred E.; Martin, Roy W.; Kimmey, Michael B.; Schuffler, Michael D.; Proctor, Andrew H.; Jiranek, Geoffrey C., Method and apparatus for determining the motility of a region in the human body.
Oda, Masaomi; Chandrasiri, Naiwala Pathirannehelage, Method and apparatus of facial image conversion by interpolation/extrapolation for plurality of facial expression components representing facial image.
Borah, Joshua D.; Merriam, James C.; Velez, Jose, Method and system for generating a synchronous display of a visual presentation and the looking response of many viewers.
Dryer,D. Christopher; Flickner,Myron Dale; Mao,Jianchang, Method and system for real-time determination of a subject's interest level to media content.
Endrikhovski,Serguei; Fedorovskaya,Elena A.; Matraszek,Tomasz A.; Parulski,Kenneth A.; Mir,Jose M., Method for using facial expression to determine affective information in an imaging system.
Weathers, Lawrence R., Psychotherapy apparatus and method for treating undesirable emotional arousal of a patient.
Bortolussi, Jay F.; Cusack, Jr., Francis J.; Ehn, Dennis C.; Kuzeja, Thomas M.; Saulnier, Michael S., Real-time facial recognition and verification system.
Bortolussi, Jay F.; Cusack, Jr., Francis J.; Ehn, Dennis C.; Kuzeja, Thomas M.; Saulnier, Michael S., Real-time facial recognition and verification system.
Zealear, David L.; Gibson, Alan R., System and method for evaluating neurological function controlling muscular movements.
Bayer, Leonard Robert; Mott, John Jason; Radielovic, Albina; Beer, Frederick Anton Eilers, System for conducting surveys in different languages over a network with survey voter registration.
Lee, Hans C.; Hong, Timmie T.; Williams, William H.; Fettiplace, Michael R., Method and system for using coherence of biological responses as a measure of performance of a media.
Levine, Brian; Marci, Carl; Kothuri, Ravi Kanth V., System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications.
Miller, Christopher A.; Wu, Peggy; Rye, Jeffrey M.; Funk, Harry B.; Ott, Tammy Elizabeth; Schmer-Galunder, Sonja Maria, Systems and methods for determining social perception.
Miller, Christopher A.; Wu, Peggy; Rye, Jeffrey M.; Funk, Harry B.; Ott, Tammy Elizabeth; Schmer-Galunder, Sonja Maria, Systems and methods for determining social perception.
Miller, Christopher A.; Wu, Peggy; Rye, Jeffrey M.; Funk, Harry B.; Ott, Tammy Elizabeth; Schmer-Galunder, Sonja Maria, Systems and methods for determining social perception.
Miller, Christopher A.; Wu, Peggy; Rye, Jeffrey M.; Funk, Harry B.; Ott, Tammy Elizabeth; Schmer-Galunder, Sonja Maria, Systems and methods for determining social perception scores.