An information processing apparatus includes a face detecting unit that detects a face area included in image data, a face-component detecting unit that detects a face component from the face area detected by the face detecting unit, and a line-of-sight discriminating unit that executes line-of-sight discrimination processing for a face image from which the face component is detected by the face-component detecting unit. The line-of-sight discriminating unit discriminates whether the line of sight of the face image data from which the face component is detected is in a positive state, in which the line of sight is directed in a camera direction, or in a negative state, in which the line of sight is not directed in the camera direction, by collation processing between input face image data and a line-of-sight discrimination dictionary in which learning data including classification data corresponding to the respective states is stored.
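The claims characterize the discriminator as an algorithm that uses difference values of pixel pairs at specific image positions as comparison and collation data, collated against a dictionary of learned, labeled data. The sketch below illustrates that idea with a toy nearest-neighbor "dictionary"; the pixel-pair positions, the feature extraction, and the distance-based collation rule are assumptions chosen for illustration and are not disclosed by the patent.

```python
# Illustrative sketch only: a toy line-of-sight discriminator in the spirit
# of the abstract. The pair positions, features, and nearest-neighbor
# collation are assumptions; the patent does not fix an implementation.

# Fixed pixel-pair positions (row, col) within a normalized face crop.
PIXEL_PAIRS = [((2, 3), (2, 6)), ((4, 1), (4, 8)), ((7, 4), (7, 5))]

def pixel_pair_features(face):
    """Difference value of each pixel pair, used as collation data."""
    return [face[r1][c1] - face[r2][c2] for (r1, c1), (r2, c2) in PIXEL_PAIRS]

def classify_gaze(face, dictionary):
    """Collate a face crop against a dictionary of labeled feature vectors.

    dictionary: list of (features, label) pairs with label "positive"
    (line of sight directed at the camera) or "negative" (directed away).
    Returns the label of the nearest entry under squared Euclidean distance.
    """
    feats = pixel_pair_features(face)
    best_label, best_dist = None, float("inf")
    for ref_feats, label in dictionary:
        dist = sum((a - b) ** 2 for a, b in zip(feats, ref_feats))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

A real system would store many labeled face images per state in the dictionary and would learn which pixel pairs discriminate best; this sketch only shows the collate-and-classify flow.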
Representative Claims
1. An information processing apparatus, comprising: a face detecting unit, implemented on a hardware processor of the information processing apparatus, that detects a face area included in image data; a face-component detecting unit that detects a face component from the face area detected by the face detecting unit; and a line-of-sight discriminating unit that executes line-of-sight discrimination processing for a face image from which the face component is detected, wherein the line-of-sight discriminating unit executes processing for discriminating a state of a line of sight of face image data of the face image from which the face component is detected, the state being one of a positive state in which the line of sight is directed in a camera direction and a negative state in which the line of sight is not directed in the camera direction, the processing being based on collation processing of the face image data with respect to a line-of-sight discrimination dictionary in which learning data, including a plurality of face images classified in the positive state and a plurality of face images classified in the negative state, is stored, and a warning output unit outputs a warning signal when a face image classified in the negative state is included in a line of sight of a face included in an acquired image of the imaging apparatus.

2. The information processing apparatus according to claim 1, wherein the face-component detecting unit detects eyes, a nose, and a mouth from the face area detected by the face detecting unit, and the line-of-sight discriminating unit executes the line-of-sight discrimination processing for the face image data from which the eyes, the nose, and the mouth are detected.

3. The information processing apparatus according to claim 1, wherein the line-of-sight discriminating unit executes the line-of-sight discrimination processing according to an algorithm that uses a difference value, of a pixel pair in a specific position of an image, as comparison and collation data.

4. The information processing apparatus according to claim 1, further comprising an image processing unit that executes rotation processing, size normalization processing, and face-area slicing processing for the face image from which the face component is detected by the face-component detecting unit, wherein the line-of-sight discriminating unit inputs the face image processed by the image processing unit and executes the line-of-sight discrimination processing.

5. The information processing apparatus according to claim 1, wherein the face detecting unit executes, with reference to a face detection dictionary in which different types of face area image information are stored, face-area detection processing for detecting the face area included in the image data using an algorithm that uses a difference value, of a pixel pair in a specific position of an image, as comparison and collation data.

6. The information processing apparatus according to claim 1, wherein the face-component detecting unit executes, with reference to a face component detection dictionary in which different types of face component image information are stored, face-component detection processing for detecting the face component from the face area detected by the face detecting unit using an algorithm that uses a difference value, of a pixel pair in a specific position of an image, as comparison and collation data.

7.
The information processing apparatus according to claim 1, further comprising a sound recognizing unit that executes sound recognition processing, wherein the information processing apparatus combines a line-of-sight discrimination result in the line-of-sight discriminating unit and a sound recognition result in the sound recognizing unit and executes analysis of a speaker.

8. The information processing apparatus according to claim 1, wherein the information processing apparatus is an imaging apparatus, the information processing apparatus further includes a shutter control unit that receives a line-of-sight discrimination result in the line-of-sight discriminating unit and executes shutter control for the imaging apparatus, and the shutter control unit performs control for disabling a shutter operation when a face image classified in the negative state is included in a line of sight of a face included in an acquired image of the imaging apparatus.

9. The information processing apparatus according to claim 1, further comprising a frame selecting unit that receives a line-of-sight discrimination result corresponding to plural image frames discriminated by the line-of-sight discriminating unit and performs selection processing for the image data, wherein the frame selecting unit performs selection of an image frame from the image data based on whether the line-of-sight discrimination result specifies the positive state or the negative state, for each of the plural image frames.

10. The information processing apparatus according to claim 1, wherein the information processing apparatus applies a line-of-sight discrimination result of the line-of-sight discriminating unit to data retrieval processing and executes retrieval processing for selecting and extracting an image that is in one of the positive state, in which the line of sight is directed in the camera direction, and the negative state, in which the line of sight is not directed in the camera direction.

11. The information processing apparatus according to claim 1, wherein the information processing apparatus performs processing for storing a line-of-sight discrimination processing result of the image data in a storing unit as correspondence information, with the image data.

12. The information processing apparatus according to claim 1, wherein, when the species of a subject in the image data is a cat, the plurality of face images classified in the positive state and the plurality of face images classified in the negative state include face images of cats.

13. The information processing apparatus according to claim 1, wherein, when the species of a subject in the image data is a dog, the plurality of face images classified in the positive state and the plurality of face images classified in the negative state include face images of dogs.

14. The information processing apparatus according to claim 1, wherein the information processing apparatus is an imaging apparatus having an image capture function that is controlled based on a line-of-sight discrimination result.

15. The information processing apparatus according to claim 1, wherein the information processing apparatus is an imaging apparatus having an image capture function that is disabled based on a line-of-sight discrimination result.

16.
The information processing apparatus according to claim 1, wherein the information processing apparatus is an imaging apparatus having an image capture function that is augmented with an output display based on a line-of-sight discrimination result.

17. The information processing apparatus according to claim 1, further comprising: a rotation-correction processing unit that performs image rotation correction using a positional relationship among eyes, a nose, and a mouth in the face image.

18. An information processing method, executed in an information processing apparatus, the information processing method comprising: detecting, by a hardware processor of the information processing apparatus, a face area included in image data; detecting, by the information processing apparatus, a face component from the face area detected in the face detecting step; executing, by the information processing apparatus, line-of-sight discrimination processing for a face image from which the face component is detected in the face-component detecting step, the line-of-sight discriminating including executing processing for discriminating a state of a line of sight of face image data of the face image from which the face component is detected, the state being one of a positive state in which the line of sight is directed in a camera direction and a negative state in which the line of sight is not directed in the camera direction, the processing being based on collation processing of the face image with respect to a line-of-sight discrimination dictionary in which learning data, including a plurality of face images classified in the positive state and a plurality of face images classified in the negative state, is stored; determining, by a warning output unit of the information processing apparatus, whether to output a warning signal based on a line-of-sight discrimination result from the line-of-sight discriminating; and outputting the warning signal when a face image classified in the negative state is included in a line of sight of a face included in an acquired image of the imaging apparatus.

19. The information processing method according to claim 18, wherein the face-component detecting detects eyes, a nose, and a mouth from the face area detected in the face detecting step, and the line-of-sight discriminating executes the line-of-sight discrimination processing for the face image data from which the eyes, the nose, and the mouth are detected.

20. The information processing method according to claim 18, wherein the line-of-sight discriminating executes the line-of-sight discrimination processing according to an algorithm that uses a difference value, of a pixel pair in a specific position of an image, as comparison and collation data.

21. The information processing method according to claim 18, further comprising executing rotation processing, size normalization processing, and face-area slicing processing for the face image from which the face component is detected in the face-component detecting, wherein the line-of-sight discriminating inputs the face image processed in the image processing and executes the line-of-sight discrimination processing.

22. The information processing method according to claim 18, wherein the face detecting executes, with reference to a face detection dictionary in which different types of face area image information are stored, face-area detection processing for detecting the face area included in the image data using an algorithm that uses a difference value, of a pixel pair in a specific position of an image, as comparison and collation data.

23.
The information processing method according to claim 18, wherein the face-component detecting executes, with reference to a face component detection dictionary in which different types of face component image information are stored, face-component detection processing for detecting the face component from the face area detected in the face detecting using an algorithm that uses a difference value, of a pixel pair in a specific position of an image, as comparison and collation data.

24. The information processing method according to claim 18, further comprising: executing sound recognition processing; and combining a line-of-sight discrimination result and a sound recognition result in the sound recognizing and executing analysis of a speaker.

25. The information processing method according to claim 18, wherein the information processing apparatus is an imaging apparatus, the information processing method further includes inputting a line-of-sight discrimination result in the line-of-sight discriminating and executing shutter control for the imaging apparatus, and the shutter control includes performing control for disabling a shutter operation when a face image classified in the negative state is included in a line of sight of a face included in an acquired image of the imaging apparatus.

26. The information processing method according to claim 18, further comprising: inputting a line-of-sight discrimination result corresponding to plural image frames discriminated in the line-of-sight discriminating and performing selection processing for the image data, wherein the frame selecting includes selecting an image frame from the image data based on whether the line-of-sight discrimination result specifies the positive state or the negative state, for each of the plural image frames.

27.
The information processing method according to claim 18, further comprising applying a line-of-sight discrimination result in the line-of-sight discriminating to data retrieval processing and executing retrieval processing for selecting and extracting an image that is in one of the positive state, in which the line of sight is directed in the camera direction, and the negative state, in which the line of sight is not directed in the camera direction.

28. The information processing method according to claim 18, further comprising performing processing to store a line-of-sight discrimination processing result of the image data in a storing unit as correspondence information, with the image data.

29. The information processing method according to claim 18, further comprising: performing, by the information processing apparatus, an image rotation correction using a positional relationship among eyes, a nose, and a mouth in the face image.

30. An information processing apparatus, comprising: a face detecting unit, implemented on a hardware processor of the information processing apparatus, that detects a face area included in image data; a face-component detecting unit that detects a face component from the face area detected by the face detecting unit; and a line-of-sight discriminating unit that executes line-of-sight discrimination processing for a face image from which the face component is detected, wherein the line-of-sight discriminating unit executes processing for discriminating a state of a line of sight of face image data of the face image from which the face component is detected, the state being one of a positive state in which the line of sight is directed in a camera direction and a negative state in which the line of sight is not directed in the camera direction, the processing being based on collation processing of the face image data with respect to a line-of-sight discrimination dictionary in which learning data, including a plurality of face images classified in the positive state and a plurality of face images classified in the negative state, is stored, and a warning output unit determines whether to output a warning signal based on a line-of-sight discrimination result from the line-of-sight discriminating unit, wherein a different type of the line-of-sight discrimination dictionary is used based on the species of a subject, in the image data, corresponding to the face image, and wherein the different type of the line-of-sight discrimination dictionary includes a plurality of images of subjects of the same species as the subject classified in the positive state and a plurality of images of subjects of the same species as the subject classified in the negative state.

31. An imaging apparatus, comprising: a face detecting unit, implemented on a hardware processor of the imaging apparatus, that detects a face area included in image data; a face-component detecting unit that detects a face component from the face area detected by the face detecting unit; and a line-of-sight discriminating unit that executes line-of-sight discrimination processing for a face image from which the face component is detected, the line-of-sight discriminating unit executing processing for discriminating a state of a line of sight of face image data of the face image from which the face component is detected, the state being one of a positive state in which the line of sight is directed in a camera direction and a negative state in which the line of sight is not directed in the camera direction, the processing being based on collation processing of the face image data with respect to a line-of-sight discrimination dictionary in which learning data, including classification data corresponding to the positive and negative states, is stored; and a shutter control unit that executes a shutter control operation for the imaging apparatus based on a line-of-sight discrimination result, the shutter control operation disabling the shutter control unit when the line-of-sight discrimination result indicates that at least a portion of the face image data is classified in the negative state.
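Claims 4, 17, and 29 refer to rotation processing and rotation correction based on the positional relationship of the eyes, nose, and mouth, which implies estimating and undoing the in-plane roll of the face before discrimination. A minimal sketch of that correction, under the assumption that the two eye centers are already detected (the function names and coordinate conventions are hypothetical; the claims do not fix an implementation):

```python
import math

def roll_angle(left_eye, right_eye):
    """In-plane rotation of the face, from the line through the eye centers.

    Eyes are (x, y) image coordinates with y increasing downward; a level
    pair of eyes yields 0 radians.
    """
    (lx, ly), (rx, ry) = left_eye, right_eye
    return math.atan2(ry - ly, rx - lx)

def rotate_point(pt, center, angle):
    """Rotate pt about center by -angle, i.e. undo the measured face roll."""
    x, y = pt[0] - center[0], pt[1] - center[1]
    c, s = math.cos(-angle), math.sin(-angle)
    return (center[0] + c * x - s * y, center[1] + s * x + c * y)

def level_eyes(left_eye, right_eye):
    """Return the eye positions after rotation correction (eyes horizontal)."""
    angle = roll_angle(left_eye, right_eye)
    center = ((left_eye[0] + right_eye[0]) / 2,
              (left_eye[1] + right_eye[1]) / 2)
    return (rotate_point(left_eye, center, angle),
            rotate_point(right_eye, center, angle))
```

Applying the same rotation to every pixel of the face crop, followed by size normalization and face-area slicing, would produce the normalized input that the line-of-sight discriminating unit consumes.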
Patents cited by this patent (8)
Aoyama, Keisuke (JPX): Camera having a line-of-sight detecting means.
Kakii, Toshiaki: Two-way interactive system, terminal equipment and image pickup apparatus having mechanism for matching lines of sight between interlocutors through transmission means.