| Country / Type | United States (US) Patent, Granted |
|---|---|
| IPC (7th edition) | |
| Application No. | UP-0805392 (2004-03-22) |
| Registration No. | US-7593552 (2009-10-20) |
| Priority | JP-2003-096271 (2003-03-31); JP-2003-096520 (2003-03-31) |
| Inventor / Address | |
| Applicant / Address | |
| Attorney / Address | |
| Citations | Cited by 376 patents; cites 4 patents |
A gesture recognition apparatus for recognizing postures or gestures of an object person based on images of the object person captured by cameras. The gesture recognition apparatus includes: a face/fingertip position detection means which detects a face position and a fingertip position of the object person in three-dimensional space based on contour information and human skin region information of the object person to be produced by the images captured; and a posture/gesture recognition means which operates to detect changes of the fingertip position by a predetermined method, to process the detected results by a previously stored method, to determine a posture or a gesture of the object person, and to recognize a posture or a gesture of the object person.
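The detection step summarized above (and detailed in claim 6 below) locates the fingertip among the extreme points of the detected skin region: the longer of the vertical and horizontal extents is taken as the direction in which the fingers extend. The following is a minimal sketch of that comparison, not the patented implementation; the patent does not specify how the final endpoint is chosen, so using distance from the face as a tie-break is an assumption here.

```python
import math
from dataclasses import dataclass


@dataclass
class Point:
    x: float
    y: float


def fingertip_from_endpoints(top, bottom, left, right, face):
    """Pick the fingertip among the extreme points of a hand skin region.

    top/bottom/left/right: extreme Points of the detected hand region.
    face: detected face Point. Using the endpoint farther from the face
    is an assumption; the claim only says the longer of the two extents
    determines the direction in which the fingertips extend.
    """
    vertical = bottom.y - top.y      # distance between top-bottom end points
    horizontal = right.x - left.x    # distance between right-left end points
    # The longer extent is taken as the direction the fingers extend.
    candidates = (top, bottom) if vertical >= horizontal else (left, right)
    # Assumed tie-break: the endpoint farther from the face is the fingertip.
    return max(candidates, key=lambda p: math.hypot(p.x - face.x, p.y - face.y))
```

For a raised hand whose region is taller than it is wide, the vertical branch is taken and the upper endpoint (farther from the face) is returned.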
What is claimed is:

1. A gesture recognition apparatus for recognizing postures or gestures of an object person based on images of the object person captured by cameras, comprising: a three dimensional face and fingertip position detection means for detecting a face position and a fingertip position of the object person in three-dimensional space based on three dimensional information to be produced by the images captured, wherein the three dimensional information comprises contour information, human skin region information, and distance information of the object person with respect to the cameras based on a parallax of the images captured by the cameras; and a posture or gesture recognition means which operates to detect changes of the fingertip position by detecting a relative position between the face position and the fingertip position and changes of the fingertip position relative to the face position, to process the detected changes by comparing the detected changes with posture data or gesture data previously stored, to determine whether the detected changes are a posture or a gesture of the object person, and to recognize a posture or a gesture of the object person, wherein the posture or gesture recognition means sets in the detected changes a determination region with a sufficient size for a hand of the object person, and compares an area of the hand with an area of the determination region to determine whether the compared detected changes corresponds to the gesture data or the posture data, thereby to distinguish postures from gestures, which are similar in relative position between the face position and the fingertip position.

2. A gesture recognition apparatus according to claim 1, wherein the relative position between the face position and the fingertip position is detected by comparing heights thereof and distances thereof from the cameras.

3.
A gesture recognition apparatus according to claim 1, wherein the posture or gesture recognition means recognizes postures or gestures of the object person by means of pattern matching.

4. A gesture recognition apparatus according to claim 1, wherein the predetermined method is to calculate a feature vector from an average and variance of a predetermined number of frames for an arm and hand position or a hand fingertip position, and the previously stored method is to calculate for all postures or gestures a probability density of posteriori distributions of each random variable based on the feature vector and by means of a statistical method so as to determine a posture or a gesture with a maximum probability density.

5. A gesture recognition apparatus according to claim 4, wherein the posture or gesture recognition means recognizes a posture or a gesture of the object person when a same posture or gesture is repeatedly recognized for a certain times or more in a certain number of frames.

6. A gesture recognition apparatus according to claim 1, wherein the three dimensional face and fingertip position detection means detects the fingertip position of the object person by detecting an arm or hand position based on the contour information and the human skin information; searching for top-bottom end points and right-left end points of human-skin region based on the detected arm or hand position and the human skin region information; comparing a vertical direction distance between the top-bottom end points and a horizontal direction distance between the right-left end points, and determining either the vertical distance or the horizontal distance that has a longer distance to be a direction where the fingertips extend, and determining which point of the top-bottom end points and the right-left end points is the fingertip position.

7.
A gesture recognition method for recognizing postures or gestures of an object person based on images of the object person captured by cameras, comprising: at least one processor performing the steps of: a face and fingertip position detecting step for detecting a face position and a fingertip position of the object person in three-dimensional space based on three-dimensional information to be produced by the images captured, wherein the three-dimensional information comprises contour information, human skin region information, and the distance information of the object person with respect to the cameras based on a parallax of the images captured by the cameras; and a posture or gesture recognizing step for detecting changes of the fingertip position by detecting a relative position between the face position and the fingertip position and changes of the fingertip position relative to the face position, processing the detected changes by comparing the detected changes with posture data or gesture data previously stored, determining whether the detected changes are a posture or a gesture of the object person, and recognizing a posture or a gesture of the object person, wherein the posture or gesture recognizing step sets in the detected changes a determination region with a sufficient size for a hand of the object person, and compares an area of the hand with an area of the determination region to determine whether the compared detected changes corresponds to the gesture data or the posture data, thereby to distinguish postures from gestures, which are similar in relative position between the face position and the fingertip position.

8.
A gesture recognition method according to claim 7, wherein the predetermined method is to calculate a feature vector from an average and variance of a predetermined number of frames for an arm and hand position or a hand fingertip position, and the previously stored method is to calculate for all postures or gestures a probability density of posteriori distributions of each random variable based on the feature vector and by means of a statistical method so as to determine a posture or a gesture with a maximum probability density.

9. A gesture recognition method according to claim 7, wherein the face and fingertip position detecting comprising detecting the fingertip position of the object person by detecting an arm or hand position based on the contour information and the human skin information; searching for top-bottom end points and right-left end points of human-skin region based on the detected arm or hand position and the human skin region information; comparing a vertical direction distance between the top-bottom end points and a horizontal direction distance between the right-left end points, and determining either the vertical distance or the horizontal distance that has a longer distance to be a direction where the fingertips extend, and determining which point of the top-bottom end points and the right-left end points is the fingertip position.

10.
A computer program embodied on a computer readable medium, the computer readable medium storing code comprising computer executable instructions configured to perform a gesture recognition method for recognizing postures or gestures of an object person based on images of the object person captured by cameras, comprising: detecting a face and fingertip position comprising three-dimensional detecting of a face position and a fingertip position of the object person in three-dimensional space based on three-dimensional information to be produced by the images captured, wherein the three-dimensional information comprises contour information, human skin region information, and the distance information of the object person with respect to the cameras based on a parallax of the images captured by the cameras; and recognizing a posture or gesture, comprising detecting changes of the fingertip position by detecting a relative position between the face position and the fingertip position and changes of the fingertip position relative to the face position, processing the detected changes by comparing the detected changes with posture data or gesture data previously stored, determining whether the detected changes are a posture or a gesture of the object person, and recognizing the posture or a gesture of the object person, wherein the recognizing the posture or the gesture sets in the detected changes a determination region with a sufficient size for a hand of the object person, and compares an area of the hand with an area of the determination region to determine whether the compared, detected changes corresponds to the gesture data or the posture data, thereby to distinguish postures from gestures, which are similar in relative position between the face position and the fingertip position.

11.
A computer program according to claim 10, wherein the detecting of changes of the fingertip position comprises calculating a feature vector from an average and variance of a predetermined number of frames for an arm and hand position or a hand fingertip position, and wherein the processing of the detected changes comprises calculating for all postures or gestures a probability density of posteriori distributions of each random variable based on the feature vector and by means of a statistical method so as to determine a posture or a gesture with a maximum probability density.

12. A computer program according to claim 10, wherein the detecting a face and fingertip position comprises detecting the fingertip position of the object person by detecting an arm or hand position based on the contour information and the human skin information; searching for top-bottom end points and right-left end points of human-skin region based on the detected arm or hand position and the human skin region information; comparing a vertical direction distance between the top-bottom end points and a horizontal direction distance between the right-left end points, and determining either the vertical distance or the horizontal distance that has a longer distance to be a direction where the fingertips extend, and determining which point of the top-bottom end points and the right-left end points is the fingertip position.
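Claims 4, 8, and 11 describe the recognition step: a feature vector built from the average and variance of the hand position over a window of frames, classified by the posture or gesture with maximum probability density. The patent only says "a statistical method", so the sketch below assumes independent Gaussian densities per feature dimension; the class names and model parameters are hypothetical, for illustration only.

```python
import math


def feature_vector(positions):
    """Mean and variance of (x, y) hand positions over a window of frames."""
    n = len(positions)
    mx = sum(x for x, _ in positions) / n
    my = sum(y for _, y in positions) / n
    vx = sum((x - mx) ** 2 for x, _ in positions) / n
    vy = sum((y - my) ** 2 for _, y in positions) / n
    return (mx, my, vx, vy)


def recognize(feat, models):
    """Return the posture/gesture whose density at `feat` is maximal.

    models: name -> (means, variances), one value per feature dimension.
    Assumption: independent Gaussians stand in for the unspecified
    'posteriori distributions of each random variable'.
    """
    def density(mean, var):
        return math.prod(
            math.exp(-(f - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
            for f, m, v in zip(feat, mean, var))
    return max(models, key=lambda name: density(*models[name]))
```

A stationary hand yields near-zero variance features, so static postures and moving gestures naturally separate in this feature space; claim 5 additionally requires the same result over several consecutive windows before accepting it.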