Methods and systems for enabling direction detection when interfacing with a computer program
Country / Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th ed.): G09G-005/00; A63F-013/06; H04N-005/225
Application No.: UP-0301673 (2005-12-12)
Registration No.: US-7646372 (2010-02-22)
Inventors: Marks, Richard L.; Deshpande, Hrishikesh R.
Applicant: Sony Computer Entertainment Inc.
Agent: Martine Penilla & Gencarella, LLP.
Citation information: cited by 101 patents; cites 171 patents
Abstract
A method for detecting direction when interfacing with a computer program is provided. The method includes capturing an image presented in front of an image capture device. The image capture device has a capture location in a coordinate space. When a person is captured in the image, the method includes identifying a human head in the image and assigning the human head a head location in the coordinate space. The method also includes identifying an object held by the person in the image and assigning the object an object location in coordinate space. The method further includes identifying a relative position in coordinate space between the head location and the object location when viewed from the capture location. The relative position defines a pointing direction of the object when viewed by the image capture device. The method may be practiced on a computer system, such as one used in the gaming field.
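The abstract (and claim 1 below) describes identifying a relative position between the head location and the object location, expressed as an azimuth angle and an altitude angle relative to the capture location, to define a pointing direction. A minimal sketch of that geometry, assuming simple (x, y, z) coordinates in the capture device's frame with x right, y up, and z away from the camera; the function name and coordinate conventions are illustrative, not taken from the patent:

```python
import math

def pointing_direction(head, obj):
    """Return (azimuth, altitude) in radians for the head-to-object ray.

    head, obj: (x, y, z) locations in the capture device's coordinate
    space (x right, y up, z pointing away from the camera).
    """
    dx = obj[0] - head[0]
    dy = obj[1] - head[1]
    dz = obj[2] - head[2]
    azimuth = math.atan2(dx, dz)                   # left/right angle from the camera axis
    altitude = math.atan2(dy, math.hypot(dx, dz))  # up/down angle above the horizontal
    return azimuth, altitude
```

An object held directly in front of the head along the camera axis yields (0, 0); holding it off to one side produces a nonzero azimuth, which a game could then map to a cursor or crosshair on the display screen.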
Representative Claims
What is claimed is:

1. A computer implemented method for detecting direction when interfacing with a computer program, comprising: (a) capturing an image presented in front of an image capture device, the image capture device having a capture location in a coordinate space and the image includes a person; (b) identifying a human head of the person in the image and assigning the human head a head location in the coordinate space; (c) identifying an object held by the person in the image and assigning the object an object location in coordinate space; (d) identifying a relative position in coordinate space between the head location and the object location when viewed from the capture location, wherein the relative position defines a pointing direction of the object when viewed by the image capture device, wherein the relative position is identified by computing an azimuth angle and an altitude angle between the head location and the object location in relation to the capture location; and (e) displaying the pointing direction of the object on a display screen.

2. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 1, wherein the capture location is at a proximate location of the display screen and the display screen is capable of rendering interactive graphics.

3. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 2, wherein the pointing direction is toward the display screen.

4. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 2, further comprising: repeating (a)-(e) continually to update the pointing direction.

5. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 4, further comprising: enabling selection of particular interactive graphics using the displayed pointing direction.

6. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 5, wherein the selection is in response to a detected trigger event.

7. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 6, wherein the detected trigger event is identified in the image, the identification comprising: identifying a first characteristic of the object held by the person at a first point in time; and identifying a second characteristic of the object held by the person at a second point in time, wherein the trigger event is activated when a degree of difference is determined to have existed between the first characteristic and the second characteristic when examined between the first point in time and the second point in time.

8. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 7, wherein the trigger event being activated is indicative of interactivity with the interactive graphics.

9. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 8, wherein the interactivity can include one or more of selection of a graphic, shooting of a graphic, touching a graphic, moving of a graphic, activation of a graphic, triggering of a graphic, and acting upon or with a graphic.

10. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 1, wherein identifying the human head is processed using template matching in combination with face detection code.

11. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 1, wherein identifying the object held by the person is facilitated by color tracking of a portion of the object.

12. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 11, wherein color tracking includes one or a combination of identifying differences in colors and identifying on/off states of colors.

13. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 4, wherein identifying the object held by the person is facilitated by identification of changes in positions of the object when repeating (a)-(e).

14. The computer implemented method for detecting direction when interfacing with the computer program as recited in claim 1, wherein the computer program is a video game.

15. A computer implemented method for detecting pointing direction of an object directed toward a display screen that can render graphics of a computer program, comprising: (a) capturing an image presented in front of an image capture device, the image capture device having a capture location in a coordinate space that is proximate to the display screen and the image includes a person; (b) identifying a first body part of the person in the image and assigning the first body part a first location in the coordinate space; (c) identifying a second body part of the person in the image and assigning the second body part a second location in coordinate space; (d) identifying a relative position in coordinate space between the first location and the second location when viewed from the capture location, wherein the relative position defines a pointing direction of the second body part when viewed by the image capture device at the capture location that is proximate to the display screen, wherein the relative position is identified by computing an azimuth angle and an altitude angle between the first location and the second location in relation to the capture location; and (e) displaying the pointing direction of the object on the display screen.

16. The computer implemented method for detecting pointing direction of the object directed toward the display screen that can render graphics of the computer program as recited in claim 15, wherein the first body part is a human head and the second body part is a human hand.

17. The computer implemented method for detecting pointing direction of the object directed toward the display screen that can render graphics of the computer program as recited in claim 15, wherein (a)-(e) is repeated continually during execution of the computer program, and a shape of the human hand is examined during the repeating of (a)-(e) to determine particular shape changes.

18. The computer implemented method for detecting pointing direction of the object directed toward the display screen that can render graphics of the computer program as recited in claim 15, wherein particular shape changes trigger interactivity with interactive graphics of the computer program.

19. The computer implemented method for detecting pointing direction of the object directed toward the display screen that can render graphics of the computer program as recited in claim 18, wherein the interactivity can include one or more of selection of a graphic, shooting of a graphic, touching a graphic, moving of a graphic, activation of a graphic, triggering of a graphic, and acting upon or with a graphic.

20. The computer implemented method for detecting pointing direction of the object directed toward the display screen that can render graphics of the computer program as recited in claim 15, wherein the second body part is identified by way of an object held by the human hand.

21. The computer implemented method for detecting pointing direction of the object directed toward the display screen that can render graphics of the computer program as recited in claim 15, wherein the object includes color.

22. The computer implemented method for detecting pointing direction of the object directed toward the display screen that can render graphics of the computer program as recited in claim 21, wherein the color is capable of switching between states to trigger interactivity with interactive graphics of the computer program.

23. The computer implemented method for detecting pointing direction of the object directed toward the display screen that can render graphics of the computer program as recited in claim 22, wherein additional colors are present on the object, the colors capable of being switched to trigger interactivity with interactive graphics of the computer program.

24. The computer implemented method for detecting pointing direction of the object directed toward the display screen that can render graphics of the computer program as recited in claim 21, wherein the color can switch from on/off states to trigger interactivity with interactive graphics of the computer program.

25. The computer implemented method for detecting pointing direction of the object directed toward the display screen that can render graphics of the computer program as recited in claim 15, wherein the computer program is a video game.
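Claims 6-7 describe a trigger event that fires when a "degree of difference" exists between a characteristic of the held object observed at two points in time. A minimal sketch of that comparison, assuming the tracked characteristic has been reduced to a single scalar per frame (for instance, the mean brightness of a tracked color patch); the sample format and threshold value are illustrative assumptions, not taken from the patent:

```python
def detect_trigger(samples, threshold=0.5):
    """Scan successive (time, value) samples of the tracked characteristic
    and return the (t1, t2) pair at which the change first meets the
    threshold (the 'degree of difference' of claim 7), or None if no
    trigger event occurs. The 0.5 threshold is an illustrative choice."""
    for (t1, v1), (t2, v2) in zip(samples, samples[1:]):
        if abs(v2 - v1) >= threshold:
            return (t1, t2)  # trigger event between these two points in time
    return None
```

For example, a deadening of the object's tracked color (an on/off state change, as in claims 12 and 24) shows up as a large drop between consecutive samples, which this sketch reports as a trigger event that the program could treat as a "shoot" or "select" action.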
Patents cited by this patent (171)
Cipolla Roberto (Cambridge GBX) Okamoto Yasukazu (Chiba-ken JPX) Kuno Yoshinori (Osaka-fu JPX), 3D human interface apparatus using motion recognition based on dynamic image processing.
Nobuo Fukushima JP; Tomotaka Muramoto JP; Masayoshi Sekine JP, Display apparatus which detects an observer body part motion in correspondence to a displayed element used to input operation instructions to start a process.
Tamura Akihiro (Yawata JPX) Sakaue Shigeo (Moriguchi JPX), Gradation correction device and image sensing device therewith for supplying images with good gradation for both front-l.
Cartabiano Michael C. ; Curran Kenneth J. ; Dick David J. ; Gibbs Douglas R. ; Kirby Morgan H. ; May Richard L. ; Storer William J. A. ; Ullman Adam N., Hand-attachable controller with direction sensing.
Sata Hironori,JPX, Image generating system and information storage medium capable of changing angle of view of virtual camera based on object positional information.
Wallace,Jon K.; Luo,Yun; Dziadula,Robert; Khairallah,Farid, Method and apparatus for determining an occupant's head location in an actuatable occupant restraining system.
Florent Raoul (Valenton FRX) Lelong Pierre (Nogent-Sur-marne FRX), Method and device for processing an image in order to construct from a source image a target image with charge of perspe.
Maes Pattie E. (Somerville MA) Blumberg Bruce M. (Pepperell MA) Darrell Trevor J. (Cambridge MA) Starner Thad E. (Somerville MA) Johnson Michael P. (Cambridge MA) Russell Kenneth B. (Boston MA) Pentl, Method and system for facilitating wireless, full-body, real-time user interaction with a digitally represented visual e.
Matey James R. (Mercerville NJ) Aceti John G. (Cranbury NJ) Pletcher Timothy A. (East Hampton NJ), Method and system for object detection for instrument control.
Wergen, Gerhard; Franz, Klaus, Method for transferring characters especially to a computer and an input device which functions according to this method.
Kobayashi Hiroshi (3-15 Hanakoganei Kodaira-shi ; Tokyo JPX) Machida Haruhiko (10-7 Nakaochiai 4-chome Shinjuki-ku ; Tokyo JPX) Ema Hideaki (Shizuoka JPX) Akedo Jun (Tokyo JPX), Method of measuring the amount of movement of an object having uniformly periodic structure.
Everett ; Jr. Hobart R. (San Diego CA) Gilbreath Gary A. (San Diego CA) Laird Robin T. (San Diego CA), Navigational control system for an autonomous vehicle.
Elko Gary W. (Summit NJ) Sondhi Man M. (Berkeley Heights NJ) West James E. (Plainfield NJ), Noise reduction processing arrangement for microphone arrays.
Levine, Bruce M.; Wirth, Allan; Knowles, C. Harry, OPHTHALMIC INSTRUMENT WITH ADAPTIVE OPTIC SUBSYSTEM THAT MEASURES ABERRATIONS (INCLUDING HIGHER ORDER ABERRATIONS) OF A HUMAN EYE AND THAT PROVIDES A VIEW OF COMPENSATION OF SUCH ABERRATIONS TO THE H.
Podoleanu, Adrian Gh.; Jackson, David A.; Rogers, John A.; Dobre, George M.; Cucu, Radu G., Optical mapping apparatus with adjustable depth resolution and multiple functionality.
Lake Royden J. (Armidale AUX) Moore John C. (Armidale AUX) Kowald Errol M. (Armidale AUX) Doerr Annegret (Armidale AUX), Optically readable coded target.
Marks, Richard L., Prop input device and method for mapping an object from a two-dimensional camera image to a three-dimensional space for controlling action in a game program.
Krueger Myron W. (55 Edith Rd. Vernon CT 06066) Hinrichsen Katrin (81 Willington Oaks Storrs CT 06268) Gionfriddo Thomas S. (81 Willington Oaks Storrs CT 06268), Real time perception of and response to the actions of an unencumbered participant/user.
Addeo Eric J. (Long Valley NJ) Robbins John D. (Denville NJ) Shtirmer Gennady (Morris Plains NJ), Sound localization system for teleconferencing using self-steering microphone arrays.
Chang Bay-Wei W. ; Fishkin Kenneth P. ; Harrison Beverly L. ; Igarashi Takeo,JPX ; Mackinlay Jock D. ; Want Roy ; Zellweger Polle T., Spinning as a morpheme for a physical manipulatory grammar.
Lyons Damian M., System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs.
Stam, Joseph S.; Bechtel, Jon H.; Reese, Spencer D.; Roberts, John K.; Tonar, William L.; Poe, G. Bruce; Newhouse, Douglas J., System for controlling exterior vehicle lights.
Stam, Joseph S.; Bechtel, Jon H.; Reese, Spencer D.; Roberts, John K.; Tonar, William L.; Poe, G. Bruce; Newhouse, Douglas J., System for controlling exterior vehicle lights.
Wang John Y. A. (Cambridge MA) Adelson Edward H. (Cambridge MA), System for encoding image data into multiple layers representing regions of coherent motion and associated motion parame.
Freeman William T. ; Leventon Michael E., System for reconstructing the 3-dimensional motions of a human figure from a monocularly-viewed image sequence.
Sawano, Takao; Matsuoka, Hirofumi; Endo, Takashi, Video game system for capturing images and applying the captured images to animated game play characters.
Fishkin Kenneth P. ; Goldberg David ; Gujar Anuj Uday ; Harrison Beverly L. ; Mynatt Elizabeth D. ; Stone Maureen C. ; Want Roy, Zoomorphic computer user interface.
Hunleth, Frank A.; Moshiri, Negar; Napier, William J.; Simpkins, Daniel S.; Wroblewski, Frank J., Control framework with a zoomable graphical user interface for organizing selecting and launching media items.
Zalewski, Gary M.; Marks, Richard; Mao, Xiadong, Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera.
Zalewski, Gary M.; Marks, Richard; Mao, Xiaodong, Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera.
Scott, Steven J.; Nguyen, Thong T.; Vasko, David A.; Gasperi, Michael; Brandt, David D., Recognition-based industrial automation control with confidence-based decision support.
Scott, Steven J.; Nguyen, Thong T.; Brandt, David D.; Gibart, Tony; Dotson, Gary D., Recognition-based industrial automation control with person and object discrimination.
Scott, Steven J.; Nguyen, Thong T.; Brandt, David D.; Gibart, Tony; Dotson, Gary D., Recognition-based industrial automation control with person and object discrimination.
Scott, Steven J.; Nguyen, Thong T.; Vasko, David A.; Wishart, Marco; Tidwell, Travis, Recognition-based industrial automation control with position and derivative decision reference.
Scott, Steven J.; Nguyen, Thong T.; Nair, Suresh; Roback, Timothy; Brandt, David D., Recognition-based industrial automation control with redundant system input support.